CN110389664A - Fire scene simulation analysis device and method based on augmented reality - Google Patents

Fire scene simulation analysis device and method based on augmented reality

Info

Publication number
CN110389664A
Authority
CN
China
Prior art keywords
augmented reality
fire
scene
gesture
binocular camera
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201910555660.9A
Other languages
Chinese (zh)
Other versions
CN110389664B (en)
Inventor
张婷
王冠云
江浩
霍日坤
赵晓亮
郭伟
赵艺钧
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Zhejiang University ZJU
Original Assignee
Zhejiang University ZJU
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Zhejiang University ZJU filed Critical Zhejiang University ZJU
Priority to CN201910555660.9A priority Critical patent/CN110389664B/en
Publication of CN110389664A publication Critical patent/CN110389664A/en
Application granted granted Critical
Publication of CN110389664B publication Critical patent/CN110389664B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Links

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/017Gesture based interaction, e.g. based on a set of recognized hand gestures

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)
  • Processing Or Creating Images (AREA)

Abstract

The invention discloses a fire scene simulation analysis device and method based on augmented reality, belonging to the technical field of augmented reality equipment. The augmented reality scene of the invention is formed by overlaying the following four image layers: the ambient video captured by the binocular camera, the real-time 3D model of the fire scene built by SLAM analysis, the toxic gas components produced after combustible materials burn, and the flow of smoke in the fire. Most of the functions of the augmented reality device can be realized by a smartphone, so no complete set of dedicated equipment needs to be purchased and the user's hardware investment is kept low. As a fire escape drill tool, the augmented reality device of the invention can also show the possible combustibles in the user's surroundings, the composition of the toxic substances produced after they burn, and the flow direction of smoke in a fire, giving it strong educational value.

Description

Fire scene simulation analysis device and method based on augmented reality
Technical field
The present invention relates to the technical field of augmented reality equipment, and in particular to a fire scene simulation analysis device and method based on augmented reality. The device and method can be realized with an ordinary mobile phone; by using components already present in the phone, such as the screen, sensors, and wireless network, the fire scene simulation analysis function can be provided to the user at a low cost.
Background technique
Currently, virtual reality devices for mobile platforms generally consist of a smartphone, a generic housing, binocular lenses, and so on. A small number of devices on the market are also equipped with an independent trackpad, a three-axis acceleration sensor, and the like, serving as a controller for interacting with the user. Existing mobile virtual reality devices are mainly dedicated to fields such as video viewing and visual games, but their advantages of requiring no external high-performance computer, needing no cables, and being freely movable have not been well exploited. After putting on a traditional virtual reality device (glasses), the user can only see the picture on the screen in front of the eyes, cannot see the surrounding environment, and cannot analyze it.
Summary of the invention
In view of the above-mentioned deficiencies of the prior art, the object of the present invention is to provide a fire scene simulation analysis method based on augmented reality.
The technical solution adopted by the invention is as follows:
A fire scene simulation analysis method based on augmented reality comprises the following steps:
Step 1: acquiring ambient video through a binocular camera;
Step 2: building a real-time 3D model of the fire scene from the ambient video captured by the binocular camera using SLAM analysis;
identifying the combustible materials in the scene with image texture recognition methods, and superimposing flame and smoke effects on them;
analyzing the toxic gas components produced after combustion according to the material of the combustibles, and displaying them;
combining the scene space model and the real-time wind direction to display the flow direction of smoke in the fire;
outputting the superimposed video described above to the screen as the result;
Step 3: superimposing the gesture model recognized by the gesture sensor on the augmented reality video, and making the virtual models established in the scene produce interactive feedback to gesture commands.
A fire scene simulation analysis device based on augmented reality comprises a binocular camera, a structured-light gesture sensor, a screen, and a processor. The binocular camera captures ambient image information and transmits it to the screen and the processor; the processor superimposes the virtual scene on the ambient image to form the augmented reality scene; the structured-light gesture sensor detects gesture information and transmits it to the processor for processing, where it is executed as interactive commands.
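For illustration only, the following Python sketch mirrors the data flow just described (binocular camera to processor to screen, gesture sensor to processor). All class and method names are hypothetical stand-ins; the real SLAM reconstruction, rendering, and sensor drivers are reduced to stubs.

```python
# Hypothetical stand-ins for the device components; not a real driver or SLAM code.
class BinocularCamera:
    def capture(self):
        # In a real device this would return a synchronized left/right frame pair.
        return {"left": None, "right": None}

class StructuredLightGestureSensor:
    def read_gesture(self):
        # Returns a recognized gesture label, or None if no hand is detected.
        return None

class Processor:
    def build_scene_model(self, frame):
        # Placeholder for the SLAM-based 3D reconstruction (step 2).
        return {"frame": frame, "model": "scene-3d-model"}

    def overlay_virtual_scene(self, scene):
        # Placeholder for the flame, smoke, toxic-gas, and smoke-flow overlays.
        return scene

    def execute_command(self, gesture):
        # Map a recognized gesture to an interactive command (hypothetical mapping).
        return {"push": "select", "grab": "move"}.get(gesture)

def run_once(camera, sensor, processor, display):
    frame = camera.capture()                      # ambient video
    scene = processor.build_scene_model(frame)    # real-time 3D scene model
    augmented = processor.overlay_virtual_scene(scene)
    gesture = sensor.read_gesture()
    if gesture is not None:
        processor.execute_command(gesture)
    display(augmented)                            # output to the screen

run_once(BinocularCamera(), StructuredLightGestureSensor(), Processor(), print)
```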
The beneficial effects of the present invention are:
The present invention uses an augmented reality device and method that let the user invest less in hardware by largely relying on the existing sensors of a smartphone, so no complete set of dedicated equipment needs to be purchased. When using the augmented reality equipment of the mobile platform, what the user faces is a half-real, half-virtual world with a stronger and deeper sense of immersion; the surrounding environment can still be perceived during use, and the user can walk around freely while wearing the augmented reality equipment. As a fire escape drill tool, the augmented reality device of the invention can also show the possible combustibles in the user's surroundings, the composition of the toxic substances produced after they burn, and the flow direction of smoke in a fire, giving it strong educational value.
Detailed description of the invention
Fig. 1 is a flow chart of a specific method of the present invention;
Fig. 2 is a functional module diagram of the augmented reality device in a specific example of the present invention;
Fig. 3 is a schematic diagram of the output image layers of the augmented reality device in a specific example of the present invention;
Fig. 4 is a structural schematic diagram of the augmented reality device in a specific example of the present invention.
Specific embodiment
To make the objects, technical solutions, and advantages of the present invention clearer, the present invention is further described below with reference to the embodiments and the accompanying drawings.
Embodiment
Fig. 1 is a flow chart of the realization of the fire scene simulation analysis function based on augmented reality in one embodiment. The method comprises:
1. Obtaining the surrounding dynamic scene through the binocular camera
The binocular camera consists of two cameras of the same model placed side by side, with a spacing equal to the distance between a person's two eyes, and the two cameras are kept fully synchronized in image transmission.
2. Transmitting the binocular scene images to the smartphone
A smartphone generally comprises modules such as a central processing unit, a screen, a three-axis acceleration sensor, and a camera.
In this embodiment, the smartphone serves as the core component of the augmented reality device and undertakes most tasks, including image computation, orientation sensing, and image output; its functional structure is shown in Fig. 2.
The specific steps are: the binocular images collected by the binocular camera of the device are transmitted to the mobile phone frame by frame through the phone's OTG interface, analyzed with the SLAM algorithm by the central processing unit built into the smartphone, and a spatial model of the surrounding dynamic scene is constructed in real time.
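As an illustrative sketch only, the snippet below covers just a stereo depth estimation front end, assuming rectified grayscale frames and OpenCV; the full SLAM pipeline (tracking and mapping) described in this step is not reproduced, and the focal length, baseline, and file names are assumed values.

```python
import cv2
import numpy as np

def depth_from_stereo(left_gray, right_gray, focal_px, baseline_m):
    """Estimate a depth map (metres) from one rectified grayscale stereo pair."""
    matcher = cv2.StereoSGBM_create(
        minDisparity=0,
        numDisparities=128,   # must be divisible by 16
        blockSize=5,
    )
    disparity = matcher.compute(left_gray, right_gray).astype(np.float32) / 16.0
    disparity[disparity <= 0] = np.nan        # mark invalid matches
    return focal_px * baseline_m / disparity  # depth = f * B / d

# Usage (hypothetical file names and calibration values):
# left  = cv2.imread("left.png",  cv2.IMREAD_GRAYSCALE)
# right = cv2.imread("right.png", cv2.IMREAD_GRAYSCALE)
# depth = depth_from_stereo(left, right, focal_px=700.0, baseline_m=0.063)
```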
3. Adding different special effects to the augmented reality scene
Depending on the augmented reality scene, the scene model needs to be modified and augmented to different degrees by software.
For example, in an augmented reality scene simulating fire escape, the software can run image texture recognition methods on the scene model constructed in real time to identify flammable articles in the surrounding scene and superimpose simulated flame and smoke effects on them.
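The patent does not specify which texture recognition method is used; the following is a hypothetical sketch using local binary pattern (LBP) histograms and an off-the-shelf SVM, assuming a labeled set of combustible and non-combustible image patches is available.

```python
import numpy as np
from skimage.feature import local_binary_pattern
from sklearn.svm import SVC

def lbp_histogram(gray_patch, points=8, radius=1):
    """Describe a grayscale patch by its normalized uniform-LBP histogram."""
    lbp = local_binary_pattern(gray_patch, points, radius, method="uniform")
    bins = points + 2  # number of uniform patterns
    hist, _ = np.histogram(lbp, bins=bins, range=(0, bins), density=True)
    return hist

# Training data would come from a labeled patch set (assumed here):
# features = np.array([lbp_histogram(p) for p in train_patches])
# classifier = SVC(kernel="rbf").fit(features, train_labels)
# is_combustible = classifier.predict([lbp_histogram(test_patch)])[0]
```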
4. Calculating the special effects in real time in combination with a spatial fluid model
In this embodiment, in the simulated fire escape scene, the flames and dense smoke of the burning combustibles simulated by the augmented reality system follow the physical laws of fluid motion (for a concrete implementation, see "Research and Implementation of Fluid Simulation in Fire Scene Simulation" by Song Gao) and can flow around the scene model. The components of the toxic gases generated after combustion can be distinguished by different colors in the augmented reality scene.
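As a toy illustration of advecting a smoke density field along a wind direction (not the cited fluid simulation method), the sketch below performs a semi-Lagrangian advection step on a 2D grid with nearest-neighbour sampling; the grid size, wind vector, and time step are arbitrary assumptions.

```python
import numpy as np

def advect_smoke(density, wind_u, wind_v, dt=0.1):
    """Move a 2D smoke density grid along a uniform wind (u, v), in cells per step."""
    h, w = density.shape
    ys, xs = np.meshgrid(np.arange(h), np.arange(w), indexing="ij")
    # Trace each cell backwards along the wind and sample the old density there.
    src_x = np.clip(xs - wind_u * dt, 0, w - 1).astype(int)
    src_y = np.clip(ys - wind_v * dt, 0, h - 1).astype(int)
    return density[src_y, src_x]

smoke = np.zeros((64, 64))
smoke[32, 32] = 1.0            # a smoke source at the burning object
for _ in range(50):            # smoke drifts along the wind over time
    smoke = advect_smoke(smoke, wind_u=3.0, wind_v=1.0, dt=0.2)
```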
5. Obtaining user gestures through the gesture sensor
The gesture sensor is based on the structured-light principle and consists of an infrared laser point-cloud emitter and an infrared camera with a filter. The infrared laser point cloud is projected onto the user's hand, the reflected image is received by the infrared camera, and by identifying the undulations of its texture a model of the user's gesture can be generated. The finger movements are then analyzed and translated into gesture interaction commands.
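A minimal sketch of one sub-step only: converting a structured-light depth image of the hand into a 3D point cloud with the pinhole camera model. The intrinsics (fx, fy, cx, cy) and the depth threshold used as a crude hand segmentation are assumed values, not parameters from the patent.

```python
import numpy as np

def depth_to_hand_points(depth_m, fx=365.0, fy=365.0, cx=160.0, cy=120.0,
                         max_range_m=0.6):
    """Back-project near depth pixels (assumed to be the hand) into an N x 3 point cloud."""
    h, w = depth_m.shape
    v, u = np.meshgrid(np.arange(h), np.arange(w), indexing="ij")
    valid = (depth_m > 0) & (depth_m < max_range_m)   # keep only near points
    z = depth_m[valid]
    x = (u[valid] - cx) * z / fx
    y = (v[valid] - cy) * z / fy
    return np.stack([x, y, z], axis=1)

# hand_points = depth_to_hand_points(depth_image)  # depth_image: H x W array in metres
```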
6. Superimposing the gesture model on the augmented reality scene
The model of the user's gesture is transferred to the smartphone through the phone's OTG interface and superimposed on the half-virtual, half-real augmented reality scene generated by computation. Gestures such as push, pull, throw, grab, and clap can be recognized, executed as interactive commands, and fed back in real time to the final user interface.
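The mapping from the gestures named above to interactive commands is not specified in the patent; the following dispatch table is a hypothetical example of how recognized gestures could be bound to commands and fed back to the scene.

```python
# Hypothetical gesture-to-command bindings.
GESTURE_COMMANDS = {
    "push":  "press_virtual_button",
    "pull":  "open_menu",
    "throw": "dismiss_object",
    "grab":  "move_object",
    "clap":  "reset_scene",
}

def dispatch(gesture, scene_commands):
    """Execute the command bound to a gesture and feed the result back to the UI state."""
    command = GESTURE_COMMANDS.get(gesture)
    if command is not None:
        scene_commands.append(command)   # stand-in for updating the virtual models
    return scene_commands

ui_state = dispatch("grab", [])          # -> ["move_object"]
```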
The image finally seen by the user is composed of four image layers at different levels and is displayed by the smartphone screen through the lenses, as shown in Fig. 3.
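A minimal sketch of compositing the four layers described in this document (ambient video, fire-scene model render, toxic-gas overlay, smoke-flow overlay), assuming RGBA images and straight alpha blending; the layer order and opacity values are assumptions for illustration.

```python
import numpy as np

def composite(layers_rgba):
    """Blend a back-to-front list of H x W x 4 float images (values in [0, 1])."""
    out = layers_rgba[0][..., :3].copy()
    for layer in layers_rgba[1:]:
        alpha = layer[..., 3:4]
        out = layer[..., :3] * alpha + out * (1.0 - alpha)
    return out

h, w = 240, 320
ambient   = np.ones((h, w, 4));  ambient[..., 3] = 1.0    # camera video (opaque base)
fire_3d   = np.zeros((h, w, 4)); fire_3d[..., 3] = 0.4    # fire scene model render
gas_layer = np.zeros((h, w, 4)); gas_layer[..., 3] = 0.2  # toxic gas components
smoke     = np.zeros((h, w, 4)); smoke[..., 3] = 0.3      # smoke flow
frame = composite([ambient, fire_3d, gas_layer, smoke])   # final image shown on screen
```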

Claims (2)

1. A fire scene simulation analysis method based on augmented reality, characterized by comprising the following steps:
Step 1: acquiring ambient video through a binocular camera;
Step 2: building a real-time 3D model of the fire scene from the ambient video captured by the binocular camera using SLAM analysis;
identifying the combustible materials in the scene with image texture recognition methods, and superimposing flame and smoke effects on them;
analyzing the toxic gas components produced after combustion according to the material of the combustibles, and displaying them;
combining the scene space model and the real-time wind direction to display the flow direction of smoke in the fire;
outputting the superimposed video described above to the screen as the result;
Step 3: superimposing the gesture model recognized by the gesture sensor on the augmented reality video, and making the virtual models established in the scene produce interactive feedback to gesture commands.
2. A fire scene simulation analysis device based on augmented reality, characterized by comprising a binocular camera, a structured-light gesture sensor, a screen, and a processor, wherein the binocular camera captures ambient image information and transmits it to the screen and the processor; the processor superimposes the virtual scene on the ambient image to form the augmented reality scene; and the structured-light gesture sensor detects gesture information and transmits it to the processor for processing, where it is executed as interactive commands.
CN201910555660.9A 2019-06-25 2019-06-25 Fire scene simulation analysis device and method based on augmented reality Active CN110389664B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910555660.9A CN110389664B (en) 2019-06-25 2019-06-25 Fire scene simulation analysis device and method based on augmented reality

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201910555660.9A CN110389664B (en) 2019-06-25 2019-06-25 Fire scene simulation analysis device and method based on augmented reality

Publications (2)

Publication Number Publication Date
CN110389664A true CN110389664A (en) 2019-10-29
CN110389664B CN110389664B (en) 2020-09-01

Family

ID=68285900

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910555660.9A Active CN110389664B (en) 2019-06-25 2019-06-25 Fire scene simulation analysis device and method based on augmented reality

Country Status (1)

Country Link
CN (1) CN110389664B (en)

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111105660A (en) * 2019-11-27 2020-05-05 重庆特斯联智慧科技股份有限公司 Augmented reality stereoscopic display method and system for fire drill
CN111640277A (en) * 2020-04-22 2020-09-08 哈尔滨工程大学 Fire condition display system
CN111951156A (en) * 2020-08-24 2020-11-17 杭州趣维科技有限公司 Method for drawing photoelectric special effect of graph

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102142055A (en) * 2011-04-07 2011-08-03 上海大学 True three-dimensional design method based on augmented reality interactive technology
CN105955456A (en) * 2016-04-15 2016-09-21 深圳超多维光电子有限公司 Virtual reality and augmented reality fusion method, device and intelligent wearable equipment
US20180190030A1 (en) * 2016-12-30 2018-07-05 Baidu Online Network Technology (Beijing) Co., Ltd. Display method and apparatus
CN108446018A (en) * 2018-02-12 2018-08-24 上海青研科技有限公司 A kind of augmented reality eye movement interactive system based on binocular vision technology
CN108464554A (en) * 2018-04-03 2018-08-31 深圳市晨芯成科技有限公司 The fire-fighting AR helmets, police AR glasses based on artificial intelligence image recognition and application
CN108519814A (en) * 2018-03-21 2018-09-11 北京科技大学 A kind of man-machine interactive operation system

Patent Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102142055A (en) * 2011-04-07 2011-08-03 上海大学 True three-dimensional design method based on augmented reality interactive technology
CN105955456A (en) * 2016-04-15 2016-09-21 深圳超多维光电子有限公司 Virtual reality and augmented reality fusion method, device and intelligent wearable equipment
US20170301137A1 (en) * 2016-04-15 2017-10-19 Superd Co., Ltd. Method, apparatus, and smart wearable device for fusing augmented reality and virtual reality
US20180190030A1 (en) * 2016-12-30 2018-07-05 Baidu Online Network Technology (Beijing) Co., Ltd. Display method and apparatus
CN108446018A (en) * 2018-02-12 2018-08-24 上海青研科技有限公司 A kind of augmented reality eye movement interactive system based on binocular vision technology
CN108519814A (en) * 2018-03-21 2018-09-11 北京科技大学 A kind of man-machine interactive operation system
CN108464554A (en) * 2018-04-03 2018-08-31 深圳市晨芯成科技有限公司 The fire-fighting AR helmets, police AR glasses based on artificial intelligence image recognition and application

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111105660A (en) * 2019-11-27 2020-05-05 重庆特斯联智慧科技股份有限公司 Augmented reality stereoscopic display method and system for fire drill
CN111105660B (en) * 2019-11-27 2021-11-30 重庆特斯联智慧科技股份有限公司 Augmented reality stereoscopic display method and system for fire drill
CN111640277A (en) * 2020-04-22 2020-09-08 哈尔滨工程大学 Fire condition display system
CN111951156A (en) * 2020-08-24 2020-11-17 杭州趣维科技有限公司 Method for drawing photoelectric special effect of graph
CN111951156B (en) * 2020-08-24 2021-08-03 杭州小影创新科技股份有限公司 Method for drawing photoelectric special effect of graph

Also Published As

Publication number Publication date
CN110389664B (en) 2020-09-01

Similar Documents

Publication Publication Date Title
CN103793060B (en) A kind of user interactive system and method
RU2621633C2 (en) System and method for augmented and virtual reality
CN103713737B (en) Virtual keyboard system used for Google glasses
CN110389664A (en) A kind of fire scenario sunykatuib analysis device and method based on augmented reality
CN110413105A (en) The tangible visualization of virtual objects in virtual environment
CN105374251A (en) Mine virtual reality training system based on immersion type input and output equipment
CN106600709A (en) Decoration information model-based VR virtual decoration method
JP2017147001A (en) Massive simultaneous remote digital presence world
US20140198096A1 (en) Mobile tele-immersive gameplay
CN110969905A (en) Remote teaching interaction and teaching aid interaction system for mixed reality and interaction method thereof
US20210286432A1 (en) Information processing device, information processing method, and program
CN105184858A (en) Method for augmented reality mobile terminal
KR20210036975A (en) Display device sharing and interactivity in simulated reality (SR)
CN106293099A (en) Gesture identification method and system
Li et al. A Fire Drill Training System Based on VR and Kinect Somatosensory Technologies.
KR101864717B1 (en) The apparatus and method for forming a augmented reality contents with object shape
Eriksson et al. Movement-based interaction in camera spaces: a conceptual framework
CN111083391A (en) Virtual-real fusion system and method thereof
Lee et al. Goldeye: Enhanced spatial awareness for the visually impaired using mixed reality and vibrotactile feedback
CN116682293A (en) Experiment teaching system based on augmented reality and machine vision
CN110262662A (en) A kind of intelligent human-machine interaction method
CN114020978A (en) Park digital roaming display method and system based on multi-source information fusion
Kim et al. Implementation of Local Area VR Environment using Mobile HMD and Multiple Kinects.
KR101526049B1 (en) Virtual ecology park visualization system
Garcia et al. Modifying a game interface to take advantage of advanced I/O devices

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant
GR01 Patent grant
CB03 Change of inventor or designer information

Inventor after: Zhang Ting

Inventor after: Tao Ye

Inventor after: Wang Guanyun

Inventor after: Jiang Hao

Inventor after: Huo Rikun

Inventor after: Zhao Xiaoliang

Inventor after: Guo Wei

Inventor after: Zhao Yijun

Inventor before: Zhang Ting

Inventor before: Wang Guanyun

Inventor before: Jiang Hao

Inventor before: Huo Rikun

Inventor before: Zhao Xiaoliang

Inventor before: Guo Wei

Inventor before: Zhao Yijun

CB03 Change of inventor or designer information