CN110859352A - AR fire helmet based on distributed network - Google Patents

AR fire helmet based on distributed network

Info

Publication number
CN110859352A
Authority
CN
China
Prior art keywords
positioning
fire
unit
control unit
inertial navigation
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201911097985.3A
Other languages
Chinese (zh)
Inventor
燕强 (Yan Qiang)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shaanxi Henning Electronic Technology Co Ltd
Original Assignee
Shaanxi Henning Electronic Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shaanxi Henning Electronic Technology Co Ltd
Priority to CN201911097985.3A
Publication of CN110859352A
Legal status: Pending

Classifications

    • A HUMAN NECESSITIES
    • A42 HEADWEAR
    • A42B HATS; HEAD COVERINGS
    • A42B3/00 Helmets; Helmet covers; Other protective head coverings
    • A42B3/04 Parts, details or accessories of helmets
    • A42B3/30 Mounting radio sets or communication systems
    • A HUMAN NECESSITIES
    • A42 HEADWEAR
    • A42B HATS; HEAD COVERINGS
    • A42B3/00 Helmets; Helmet covers; Other protective head coverings
    • A42B3/04 Parts, details or accessories of helmets
    • A42B3/0406 Accessories for helmets
    • A HUMAN NECESSITIES
    • A42 HEADWEAR
    • A42B HATS; HEAD COVERINGS
    • A42B3/00 Helmets; Helmet covers; Other protective head coverings
    • A42B3/04 Parts, details or accessories of helmets
    • A42B3/0406 Accessories for helmets
    • A42B3/0433 Detecting, signalling or lighting devices
    • A42B3/046 Means for detecting hazards or accidents
    • A HUMAN NECESSITIES
    • A42 HEADWEAR
    • A42B HATS; HEAD COVERINGS
    • A42B3/00 Helmets; Helmet covers; Other protective head coverings
    • A42B3/04 Parts, details or accessories of helmets
    • A42B3/18 Face protection devices
    • A42B3/22 Visors
    • G PHYSICS
    • G02 OPTICS
    • G02B OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00 Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01 Head-up displays
    • G02B27/017 Head mounted
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04W WIRELESS COMMUNICATION NETWORKS
    • H04W4/00 Services specially adapted for wireless communication networks; Facilities therefor
    • H04W4/30 Services specially adapted for particular environments, situations or purposes
    • H04W4/33 Services specially adapted for particular environments, situations or purposes for indoor environments, e.g. buildings
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04W WIRELESS COMMUNICATION NETWORKS
    • H04W4/00 Services specially adapted for wireless communication networks; Facilities therefor
    • H04W4/30 Services specially adapted for particular environments, situations or purposes
    • H04W4/38 Services specially adapted for particular environments, situations or purposes for collecting sensor information
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04W WIRELESS COMMUNICATION NETWORKS
    • H04W4/00 Services specially adapted for wireless communication networks; Facilities therefor
    • H04W4/70 Services for machine-to-machine communication [M2M] or machine type communication [MTC]
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04W WIRELESS COMMUNICATION NETWORKS
    • H04W4/00 Services specially adapted for wireless communication networks; Facilities therefor
    • H04W4/90 Services for handling of emergency or hazardous situations, e.g. earthquake and tsunami warning systems [ETWS]
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04W WIRELESS COMMUNICATION NETWORKS
    • H04W64/00 Locating users or terminals or network equipment for network management purposes, e.g. mobility management
    • G PHYSICS
    • G02 OPTICS
    • G02B OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00 Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01 Head-up displays
    • G02B27/0101 Head-up displays characterised by optical features
    • G02B2027/0138 Head-up displays characterised by optical features comprising image capture systems, e.g. camera

Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Signal Processing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Optics & Photonics (AREA)
  • Business, Economics & Management (AREA)
  • Health & Medical Sciences (AREA)
  • Emergency Management (AREA)
  • Environmental & Geological Engineering (AREA)
  • Public Health (AREA)
  • Studio Devices (AREA)

Abstract

An AR fire-fighting helmet based on a distributed network. AR goggles are arranged at the eye position, an infrared thermal imager and a visible-light camera are arranged at the front end of the helmet, and a communication and positioning unit is arranged on one side of the main body; the AR goggles, the communication and positioning unit, the thermal imager, and the camera are all signal-connected to a control assembly arranged on the other side of the main body. The main control unit of the control assembly receives the image data of the thermal imager and the visible-light camera together with the data of the communication and positioning unit, and sends the visible-light image data to the command center through the image-transmission unit of the communication and positioning unit. The main control unit also operates jointly with an auxiliary control unit at the firefighter's waist to sense combustible gas, connects to a sensing unit at the firefighter's wrist to sense vital signs, and communicates wirelessly with a positioning unit at the sole of the foot to realize inertial navigation positioning. The main control unit reconstructs the image data and displays the result through the AR goggles.

Description

AR fire helmet based on distributed network
Technical Field
The invention belongs to the technical field of fire-fighting helmets, and particularly relates to an AR fire-fighting helmet based on a distributed network.
Background
In recent years, urban populations have grown and building density has increased, making fire scenes ever more complex; firefighters face graver rescue challenges and threats to their lives, and many have sacrificed themselves to protect people's lives and property. The head, as one of the most vulnerable parts of the human body, needs particularly strong protection, and its sense organs need better guidance in a complex fire scene.
Existing helmets serve a single function: they generally provide only physical protection of the firefighter's head against external injury, and any intelligent capability must be added by wearing separate equipment.
Disclosure of Invention
To overcome the defects of the prior art, the invention aims to provide an AR fire-fighting helmet based on a distributed network that integrates multiple technical means into an auxiliary rescue system with diverse detection and sensing capabilities. It purposefully counters the adverse conditions of a fire scene, such as dense smoke, darkness, and gas leakage, assisting fire-fighting rescue while protecting the lives of firefighters to the greatest extent.
In order to achieve the purpose, the technical scheme of the invention is as follows:
An AR fire helmet based on a distributed network: AR goggles are arranged at the eye position of the main body, an infrared thermal imager and a visible-light camera are arranged at the front end of the main body, a communication and positioning unit is arranged on one side of the main body, and the AR goggles, the communication and positioning unit, the thermal imager, and the camera are all signal-connected to a control assembly arranged on the other side of the main body. The control assembly comprises a main control unit, which receives image data from the infrared thermal imager (IR) and the visible-light camera (CCD) together with data from the 4G communication part and the GPS positioning part of the communication and positioning unit. The main control unit sends the visible-light (CCD) image data to the command center through the image-transmission unit of the communication and positioning unit. It also operates jointly with an auxiliary control unit at the firefighter's waist, sensing combustible gas through a sensing unit connected to that auxiliary unit; it is further connected to a sensing unit at the firefighter's wrist to monitor vital signs, and communicates wirelessly with a positioning unit at the sole of the foot to realize inertial navigation positioning. The main control unit performs three-dimensional contour reconstruction on the image data and displays the reconstructed image through the AR goggles; it combines satellite positioning with inertial navigation positioning to locate personnel inside the building and returns the position information to the command center.
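As orientation for the scheme just described, the following is a minimal Python sketch of the data flow between the distributed units and the helmet's main control unit. Every name in it (SensorReading, MasterControl, the unit labels, the send callback) is an illustrative assumption, not something specified by the patent:

```python
# Illustrative sketch only: the patent specifies units and links, not code.
from dataclasses import dataclass, field
from typing import Callable, Dict, List

@dataclass
class SensorReading:
    unit: str               # "waist" (gas/temp/humidity), "wrist" (vitals), "sole" (IMU)
    data: Dict[str, float]  # e.g. {"gas_ppm": 12.0} or {"heart_rate": 96.0}

@dataclass
class MasterControl:
    """Helmet-side main control unit: collects distributed sensor reports."""
    readings: List[SensorReading] = field(default_factory=list)

    def ingest(self, reading: SensorReading) -> None:
        # The waist, wrist, and sole units report wirelessly to the helmet.
        self.readings.append(reading)

    def uplink(self, ccd_frame: bytes, send: Callable[[bytes], None]) -> None:
        # Visible-light imagery goes to the command center over the 4G
        # image-transmission unit; `send` stands in for that transport.
        send(ccd_frame)
```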
The AR display of the AR goggles is an optical waveguide module or a free-form surface module.
Satellite positioning is combined with inertial navigation positioning: the satellite positioning uses an RTK positioning module (a u-blox F9-series product), and the inertial navigation positioning uses an inertial module (a Maxim MAX21100). RTK positioning operates around the fire scene, while inertial navigation positioning operates inside it. Several RTK units are arranged at the edge of the fire scene, forming from points and surfaces a virtual three-dimensional boundary that covers the scene. When a firefighter wearing a single RTK unit enters the scene, the boundary set by the RTK units is triggered first; the background of the command center thereby obtains the firefighter's initial entry position, the relative data of the inertial navigation positioning are calculated from these position coordinates, and finally the absolute position of the person is obtained by combining the RTK coordinates with the inertial navigation data.
The satellite positioning and inertial navigation positioning are combined as follows:
Triggering the fire-scene boundary can be regarded as a collision of the moving coordinate with the solid boundary, which occurs when the moving coordinate (x0, y0, z0) satisfies:
x0 = (P1.x | P2.x) | y0 = (P1.y | P2.y) | z0 = (P1.z | P2.z), where P1 and P2 are the diagonally opposite corners of the boundary box, "|" denotes logical OR, and each equality tests whether the coordinate has reached the corresponding boundary face.
Integrating the inertial navigation acceleration yields the velocity components:
vx = ∫ ax dt,  vy = ∫ ay dt,  vz = ∫ az dt.
Integrating each velocity component again yields the position change, which is added to the trigger position (x0, y0, z0) to give the current position:
x = x0 + ∫ vx dt,  y = y0 + ∫ vy dt,  z = z0 + ∫ vz dt.
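Read concretely, the scheme above is a boundary-fence test plus double integration of the IMU acceleration. The following Python sketch is one possible reading; the tolerance, the sample layout, and the function names are assumptions, not from the patent:

```python
import numpy as np

def boundary_triggered(p, P1, P2, tol=0.05):
    """Collision of the moving coordinate p with a face of the box whose
    diagonally opposite corners are P1 and P2 (tolerance tol, in metres)."""
    p, P1, P2 = (np.asarray(v, dtype=float) for v in (p, P1, P2))
    return bool(np.any(np.isclose(p, P1, atol=tol) | np.isclose(p, P2, atol=tol)))

def dead_reckon(p0, accel, dt):
    """Dead reckoning from the trigger position p0: v = ∫a dt, x = x0 + ∫v dt.

    accel: (N, 3) acceleration samples in m/s²; dt: sample period in seconds.
    Returns the (N, 3) position track.
    """
    v = np.cumsum(accel, axis=0) * dt                     # velocity track
    return np.asarray(p0, dtype=float) + np.cumsum(v, axis=0) * dt
```

Each time the RTK fence is crossed again, p0 is simply reset to the fence coordinate, which matches the re-entry behavior described in the detailed description below.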
The main control unit performs three-dimensional contour reconstruction on the image data as follows (a code sketch of these steps appears after the calibration model below):
(1) The video stream acquired by the thermal infrared imager undergoes improved gray-level normalization to obtain a gray-level video stream; improved homomorphic filtering then removes noise points, and the homomorphic filtering transfer function recovers more of the video's contour detail:
K(a,b) = m0 + sqrt( n0 * (D(a,b) - m)^2 / n ), if D(a,b) > m
K(a,b) = m0 - sqrt( n0 * (D(a,b) - m)^2 / n ), otherwise
where D(a,b) and K(a,b) are the video before and after normalization, m0 and n0 are the mean and variance after normalization, and m and n are the mean and variance before normalization.
The normalized gray-level video is then processed with the improved homomorphic filter function K'(x,y), finally yielding a gray-level video stream free of noise points:
K'(x,y) = exp(Z(x,y) + D(a,b));
(2) Second-order difference processing is then applied column by column to the gray matrix of the image; the points whose absolute value in the second-order-differenced matrix exceeds a set threshold are extracted, yielding the contour video stream.
(3) With the infrared video stream as the reference standard, a matching algorithm associates the infrared video stream and the contour video stream in the same coordinate system to determine the user's position in the physical environment; the contour video stream is passed through an efficient proportional-correction algorithm whose parameters are calibrated in advance, yielding an ultra-low-latency, stable, offset-free contour video, which is then transmitted over the HDMI digital video interface protocol.
u_v3 = fx * xc / zc + cx,  v_v3 = fy * yc / zc + cy
The above formulas constitute the calibration model, where cx and cy are the optical-center coordinates of the thermal infrared imager, fx and fy are its focal lengths, and xc, yc and zc are the coordinates of the camera point in the virtual-plane coordinate system; u_v3 and v_v3 are the parameters to be calibrated. With the number of calibration points set to m, the calibration reference-point information is solved by nonlinear regression; finally the images of the contour video stream are proportion-matched and the contour video is displayed on the AR goggles.
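A compact sketch of the whole pipeline, steps (1) to (3), under stated assumptions: the "improved" normalization and filtering are taken to behave like the classical mean/variance normalization and a Gaussian high-emphasis homomorphic filter, the calibration model is read as a standard pinhole projection, and the unspecified "efficient proportional correction algorithm" is replaced by a least-squares scale fit. All parameter values are placeholders:

```python
import numpy as np

def normalize(frame, m0=128.0, n0=1000.0):
    """Step (1a): map the frame to target mean m0 and variance n0."""
    m, n = frame.mean(), frame.var() + 1e-9
    shift = np.sqrt(n0 * (frame - m) ** 2 / n)
    return np.where(frame > m, m0 + shift, m0 - shift)

def homomorphic_filter(frame, cutoff=30.0, gl=0.5, gh=2.0):
    """Step (1b): suppress illumination, boost detail in the log domain."""
    rows, cols = frame.shape
    F = np.fft.fftshift(np.fft.fft2(np.log1p(frame)))
    u = np.arange(rows)[:, None] - rows / 2
    v = np.arange(cols)[None, :] - cols / 2
    H = (gh - gl) * (1.0 - np.exp(-(u**2 + v**2) / (2.0 * cutoff**2))) + gl
    return np.expm1(np.fft.ifft2(np.fft.ifftshift(F * H)).real)

def contour_points(frame, thresh=20.0):
    """Step (2): column-wise second difference, thresholded on |value|."""
    d2 = np.diff(frame, n=2, axis=0)   # second difference down each column
    return np.argwhere(np.abs(d2) > thresh)

def project(xc, yc, zc, fx, fy, cx, cy):
    """Step (3): pinhole mapping of a camera point to (u_v3, v_v3) pixels."""
    return fx * xc / zc + cx, fy * yc / zc + cy

def fit_scale(projected, reference):
    """Step (3): least-squares scale s minimising ||s*projected - reference||
    over the m calibration reference points (a stand-in for the patent's
    nonlinear-regression step)."""
    p = np.asarray(projected, dtype=float).ravel()
    r = np.asarray(reference, dtype=float).ravel()
    return float(p @ r / (p @ p))
```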
The invention has the beneficial effects that:
(1) An integrated positioning method is used: combining satellite positioning with inertial navigation positioning locates personnel inside the building and returns the position information to the command center, so the firefighter's position is obtained more accurately;
(2) three-dimensional contour reconstruction is performed on the image data and displayed on the AR (augmented reality) goggles, giving firefighters more concrete guidance;
(3) the communication and positioning unit accesses the 4G communication network, enabling cloud access to state data and imagery.
Detailed Description
The present invention will be described in detail with reference to the accompanying drawings.
Referring to fig. 1, an AR fire helmet based on a distributed network comprises a main body, AR goggles 1 arranged at the eye position of the main body, a thermal infrared imager 2 (IR) and a visible-light camera 3 (CCD) arranged at the front end of the main body, a communication and positioning unit 4 arranged on one side of the main body, and a control assembly 5 arranged on the other side of the main body and signal-connected to the AR goggles 1, the communication and positioning unit 4, the thermal imager 2, and the camera 3. Referring to fig. 2, the control assembly 5 comprises a main control unit, which receives image data from the thermal imager and the visible-light camera together with data from the 4G communication part and the GPS positioning part of the communication and positioning unit. The main control unit sends the visible-light (CCD) image data to the command center through the image-transmission unit of the communication and positioning unit. It also operates jointly with an auxiliary control unit at the firefighter's waist, sensing combustible gas through a sensing unit connected to that auxiliary unit; it is further connected to a sensing unit at the firefighter's wrist to monitor vital signs, and communicates wirelessly with a positioning unit at the sole of the foot to realize inertial navigation positioning. The main control unit performs three-dimensional contour reconstruction on the image data, displays the reconstructed image through the AR goggles 1, combines satellite positioning with inertial navigation positioning to locate personnel inside the building, and returns the position information to the command center.
Referring to fig. 3, satellite positioning is combined with inertial navigation positioning: the satellite positioning uses an RTK positioning module (a u-blox F9-series product), and the inertial navigation positioning uses an inertial module (a Maxim MAX21100). RTK positioning mainly operates around the fire scene, while inertial navigation positioning mainly operates inside it. First, several RTK units are arranged at the edge of the fire scene, forming from points and surfaces a virtual three-dimensional boundary covering the scene. When a firefighter wearing a single RTK unit enters the scene, the boundary set by the RTK units is triggered first, the background of the command center obtains the firefighter's initial entry position, and the relative data of the inertial navigation positioning are calculated from these position coordinates. Finally, the absolute position of the person is obtained by combining the RTK coordinates with the inertial navigation data.
Considering how variable fire conditions are, a firefighter may enter the scene from any position (e.g., from the ground, mid-air, or the roof). Compared with ordinary stand-alone inertial navigation positioning, the greatest advantage of this scheme is that the scene can be entered at any position, without deliberately triggering an inertial-navigation start point before entry. Moreover, the positioning data are recalculated each time the boundary is triggered across repeated entries and exits; a code sketch of this reset behavior follows the position equations below.
Triggering the fire-scene boundary can be regarded as a collision of the moving coordinate with the solid boundary, which occurs when the moving coordinate (x0, y0, z0) satisfies:
x0 = (P1.x | P2.x) | y0 = (P1.y | P2.y) | z0 = (P1.z | P2.z), where P1 and P2 are the diagonally opposite corners of the boundary box.
Integrating the inertial navigation acceleration yields the velocity components:
vx = ∫ ax dt,  vy = ∫ ay dt,  vz = ∫ az dt.
Integrating each velocity component again yields the position change, which is added to the trigger position (x0, y0, z0) to give the current position:
x = x0 + ∫ vx dt,  y = y0 + ∫ vy dt,  z = z0 + ∫ vz dt.
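A sketch of the reset-on-trigger logic referred to above, reusing boundary_triggered and dead_reckon from the sketch in the disclosure section; the (rtk_fix, accel_window) sample layout is an assumption:

```python
def track_position(samples, P1, P2, dt):
    """samples: iterable of (rtk_fix_or_None, accel_window) pairs.
    Resets the inertial origin whenever the RTK fence is crossed."""
    origin, track = None, []
    for rtk_fix, accel in samples:
        if rtk_fix is not None and boundary_triggered(rtk_fix, P1, P2):
            origin = rtk_fix          # re-entry: restart dead reckoning here
        if origin is not None:
            track.append(dead_reckon(origin, accel, dt)[-1])
    return track
```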
the sensing unit at the waist is an environment sensing part and is mainly used for measuring environment data such as combustible gas content, temperature and humidity in the environment, an electrochemical sensor is selected as the combustible gas sensor, and SHT21 is selected as the temperature and humidity sensor.
The sensing unit of wrist be sign sensing part, mainly measure indexes such as fire fighter's rhythm of the heart, blood pressure, sign sensor equips in fire fighter with the bracelet form.
The AR display of the AR goggles 1 is an optical waveguide module or a free-form surface module; its main specification parameters are as follows:

Specification            Optical waveguide module   Free-form surface module
Field of view (FOV)      40°                        40°
Resolution               1920×1080                  1080×720
Lens thickness           1.5 mm                     8 mm
Exit pupil distance      18 mm                      16 mm
Eye box                  8×8 mm                     8×8 mm
Light transmittance      >82%                       >84%
LCoS size                0.39"                      0.37"
Virtual screen size      86" @ 3 m                  85" @ 3 m
Frame rate               60 Hz                      120 Hz
Distortion               <2%                        <2%
MTF                      100 lp/mm                  100 lp/mm
Brightness uniformity    >80%                       >80%

Claims (5)

1. An AR fire helmet based on a distributed network, characterized in that AR goggles are arranged at the eye position of a main body, an infrared thermal imager and a visible-light camera are arranged at the front end of the main body, a communication and positioning unit is arranged on one side of the main body, and the AR goggles, the communication and positioning unit, the thermal imager and the camera are all signal-connected to a control assembly arranged on the other side of the main body; the control assembly comprises a main control unit, which receives image data from the thermal imager and the visible-light camera together with data from the 4G communication part and the GPS positioning part of the communication and positioning unit; the main control unit sends the image data of the visible-light camera to a command center through an image-transmission unit in the communication and positioning unit; the main control unit also operates jointly with an auxiliary control unit at the firefighter's waist, a sensing unit connected to the auxiliary control unit sensing combustible gas; the main control unit is further connected to a sensing unit at the firefighter's wrist to sense vital signs, and communicates wirelessly with a positioning unit at the sole of the firefighter's foot to realize inertial navigation positioning; the main control unit performs three-dimensional contour reconstruction on the image data, displays the reconstructed image through the AR goggles, combines satellite positioning with inertial navigation positioning to locate personnel inside the building, and returns the position information to the command center.
2. The distributed network-based AR fire helmet of claim 1, wherein the AR display of the AR goggles is an optical waveguide module or a free-form surface module.
3. The distributed network-based AR fire helmet as recited in claim 1, wherein the satellite positioning is combined with the inertial navigation positioning; the satellite positioning uses an RTK positioning module (a u-blox F9-series product) and the inertial navigation positioning uses an inertial module (a Maxim MAX21100); RTK positioning operates around the fire scene and inertial navigation positioning operates inside it; several RTK units are arranged at the edge of the fire scene, forming from points and surfaces a virtual three-dimensional boundary covering the scene; when a firefighter wearing a single RTK unit enters the scene, the fire-scene boundary set by the RTK units is triggered first, the background of the command center obtains the firefighter's initial entry position, the relative data of the inertial navigation positioning are calculated from these position coordinates, and finally the absolute position information of the person is obtained by combining the RTK coordinates with the inertial navigation data.
4. The distributed network-based AR fire helmet of claim 1, wherein the satellite positioning and inertial navigation positioning are combined, specifically:
triggering the fire-scene boundary may be regarded as a collision of the moving coordinate with the solid boundary, which occurs when the moving coordinate (x0, y0, z0) satisfies:
x0 = (P1.x | P2.x) | y0 = (P1.y | P2.y) | z0 = (P1.z | P2.z), where P1 and P2 are the diagonally opposite corners of the boundary box;
the inertial navigation acceleration is integrated to obtain the velocity components:
vx = ∫ ax dt,  vy = ∫ ay dt,  vz = ∫ az dt;
the velocity components are integrated again to obtain the position change, which is added to the trigger position (x0, y0, z0) to obtain the current position information:
x = x0 + ∫ vx dt,  y = y0 + ∫ vy dt,  z = z0 + ∫ vz dt.
5. the distributed network-based AR fire helmet of claim 1, wherein the master control unit performs three-dimensional contour reconstruction on the image data, specifically:
(1) carrying out improved gray-level normalization on the video stream acquired by the thermal infrared imager to obtain a gray-level video stream, then removing noise points by improved homomorphic filtering, the homomorphic filtering transfer function recovering more video contour detail:
K(a,b) = m0 + sqrt( n0 * (D(a,b) - m)^2 / n ), if D(a,b) > m
K(a,b) = m0 - sqrt( n0 * (D(a,b) - m)^2 / n ), otherwise
where D(a,b) and K(a,b) are the video before and after normalization, m0 and n0 are the mean and variance after normalization, and m and n are the mean and variance before normalization;
processing the normalized gray-level video with the improved homomorphic filter function K'(x,y) to finally obtain a gray-level video stream free of noise points:
K'(x,y) = exp(Z(x,y) + D(a,b));
(2) performing second-order difference processing on the gray matrix of the image column by column, and extracting the points whose absolute value in the second-order-differenced matrix exceeds a set threshold, thereby extracting the contour video stream;
(3) taking the infrared video stream as the reference standard, associating the infrared video stream and the contour video stream in the same coordinate system through a matching algorithm to determine the user's position in the physical environment; passing the contour video stream through an efficient proportional-correction algorithm whose parameters are calibrated in advance to obtain an ultra-low-latency, stable, offset-free contour video, which is then transmitted over the HDMI (high-definition multimedia interface) digital video interface protocol;
u_v3 = fx * xc / zc + cx,  v_v3 = fy * yc / zc + cy
the above formulas constituting the calibration model, where cx and cy are the optical-center coordinates of the thermal infrared imager, fx and fy are its focal lengths, and xc, yc and zc are the coordinates of the camera point in the virtual-plane coordinate system; u_v3 and v_v3 are the parameters to be calibrated; with the number of calibration points set to m, the calibration reference-point information is obtained by solving in a nonlinear-regression manner, and finally the images of the contour video stream are proportion-matched and the contour video is displayed on the AR goggles.
CN201911097985.3A 2019-11-12 2019-11-12 AR fire helmet based on distributed network Pending CN110859352A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201911097985.3A CN110859352A (en) 2019-11-12 2019-11-12 AR fire helmet based on distributed network

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201911097985.3A CN110859352A (en) 2019-11-12 2019-11-12 AR fire helmet based on distributed network

Publications (1)

Publication Number Publication Date
CN110859352A true CN110859352A (en) 2020-03-06

Family

ID=69653502

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201911097985.3A Pending CN110859352A (en) 2019-11-12 2019-11-12 AR fire helmet based on distributed network

Country Status (1)

Country Link
CN (1) CN110859352A (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114894253A (en) * 2022-05-18 2022-08-12 威海众合机电科技有限公司 Emergency visual sense intelligent enhancement method, system and equipment
EP4176751A1 (en) * 2021-11-09 2023-05-10 Brandbull Polska S.A. Smart rescue helmet, especially for firefighters


Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104155670A (en) * 2014-08-01 2014-11-19 中国矿业大学 Battle tracking system and method aiming at firefighters at fire scene
CN107095384A (en) * 2017-04-26 2017-08-29 长春理工大学 A kind of Intelligent fire-fighting helmet device transmitted based on WIFI
CN108378450A (en) * 2018-03-08 2018-08-10 公安部天津消防研究所 A kind of perception of blast accident and risk profile early warning Intelligent fire-fighting helmet implementation method

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
Song Qingfeng et al., "An Image Enhancement Method Based on Wavelet Transform" *
Long Jianhui et al., "Parameter Calibration of a CT System Based on Image Contour Curve Extraction" *


Similar Documents

Publication Publication Date Title
US10488659B2 (en) Apparatus, systems and methods for providing motion tracking using a personal viewing device
US9973692B2 (en) Situational awareness by compressed display of panoramic views
KR102544062B1 (en) Method for displaying virtual image, storage medium and electronic device therefor
US20080023002A1 (en) Head safety device with integrated display
TWI633336B (en) Helmet mounted display, visual field calibration method thereof, and mixed reality display system
WO2016013269A1 (en) Image display device, image display method, and computer program
US11890494B2 (en) Retrofittable mask mount system for cognitive load reducing platform
US20210350762A1 (en) Image processing device and image processing method
Orlosky et al. Fisheye vision: peripheral spatial compression for improved field of view in head mounted displays
AU2013101173A4 (en) Fire Fighter&#39;s Helmet
CN110859352A (en) AR fire helmet based on distributed network
CN104769487A (en) Face mounted extreme environment thermal sensor system
US20010049837A1 (en) Helmet
CN111127822B (en) Augmented reality-based fire-fighting auxiliary method and intelligent wearable equipment
JPH0854282A (en) Head-mounted display device
CN111603134A (en) Eyeball movement testing device and method
CN109303987A Enhanced vision for firefighters using head-up display and gesture sensing
US20190380638A1 (en) System and Method for Assessment and Rehabilitation of Balance Impairment Using Virtual Reality
GB2569323A (en) Head-mountable apparatus and methods
US20020053101A1 (en) Helmet
CN108572450B (en) Head-mounted display, visual field correction method thereof and mixed reality display system
CN211021144U (en) Helmet for fire rescue and positioning
CN211323223U (en) AR fire helmet based on distributed network
CN211014852U (en) AR fire control goggles
CN110989168A (en) AR fire control goggles

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination