CN109377560A - Method for outdoor augmented reality military simulation training - Google Patents

Method for outdoor augmented reality military simulation training

Info

Publication number
CN109377560A
CN109377560A (application CN201811256948.8A)
Authority
CN
China
Prior art keywords
military
rendering
rendering engine
scene
real
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201811256948.8A
Other languages
Chinese (zh)
Inventor
陈靖
缪远东
张凡
张一凡
樊蕾
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Institute of Technology BIT
Original Assignee
Beijing Institute of Technology BIT
Application filed by Beijing Institute of Technology BIT
Priority: CN201811256948.8A (2018-10-26)
Publication: CN109377560A
Legal status: Pending

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T17/00 Three dimensional [3D] modelling, e.g. data description of 3D objects
    • G06T17/20 Finite element generation, e.g. wire-frame surface description, tessellation
    • G06T17/05 Geographic models

Abstract

The present invention provides a method for generating outdoor military simulation training scenes with augmented reality. The process is as follows: import a triangle-mesh map model of the real environment into a 3D rendering engine and attach a collider for collision detection; from the given GPS longitude, latitude, and elevation of the camera and of a simulated military target, compute the target's coordinates in the rendering engine's world coordinate system and add the target to the three-dimensional rendering scene; read the video image captured by the current camera into the rendering engine and set it as the background; compute, in real time, the camera's six-degree-of-freedom pose relative to the real scene and assign it to the virtual camera in the rendering engine; finally, transmit the rendered virtual-real fusion frame to the display device, completing rendering and display of the fused scene. The invention achieves fused display of, and real-time interaction between, virtual military targets and the real scene, effectively addressing the fidelity problem in military simulation training.

Description

Method for outdoor augmented reality military simulation training
Technical field
The invention belongs to the field of information technology, and relates in particular to a method for outdoor augmented reality military simulation training.
Background art
The development of the ground forces has entered a new historical stage: building a new type of ground force with distinct information-age characteristics, developing informatized weapons and equipment, exploring informatized modes of operation, and accelerating the joint combat capability built on networked information systems. Under present conditions, realistic adversarial exercises are the only way for the army, and the ground forces in particular, to accelerate this transition in force building and to win future joint operations based on networked information systems.
At present, armies around the world conduct simulation training mainly with dedicated training simulation systems (equipment). By technical type these can be divided into physical simulation, software simulation, and hardware-in-the-loop simulation, but the training environments all three produce fall far short of the real one. Moreover, adversarial exercises are largely carried out according to pre-scripted operations, campaign processes, and combat capabilities, so commanders and units at every level cannot experience a realistic opponent or the flexibility a real adversary would show. In addition, the precision demanded of realistic exercises keeps rising: combat capability is best tested by adversarial drills in the intended area of operations, in the specific environment, and against the opposing force, but objective constraints mean such confrontations can only be simulated locally or approximately, so the exercise rarely reaches, or even approaches, real combat. When realistic exercises assess battle damage effects, targets are not actually destroyed; the army therefore usually derives damage assessments from firing data combined with personal experience and comparative analysis. This approach cannot intuitively present the opposing force's actual damage state in a complex battlefield environment, lacks a vivid presentation process and a rigorous computational model, and provides no strong data or visual support, greatly reducing the value of realistic exercises.
Augmented reality, with its virtual-real fusion, real-time interaction, and three-dimensional tracking and registration, can in theory reflect the essential characteristics of the real world completely, while seamlessly matching all kinds of information elements to it in virtual form, thereby constructing a "hyper-real" simulated training environment that goes beyond the real world. It can therefore effectively solve the "fidelity" problem in simulation training, and it offers a feasible way to realize, improve, and strengthen realistic adversarial exercises.
Summary of the invention
The purpose of the invention is to overcome the defects of the prior art and to solve the problem of low training-environment fidelity in military simulation training. A method for generating outdoor military simulation training scenes with augmented reality is proposed, intended to give the army outdoor virtual-real fused simulation training and to provide feasible technical support and solutions for raising the level of realistic exercises.
The method of the present invention is achieved through the following technical solutions:
A method for generating outdoor military simulation training scenes with augmented reality, whose basic process is as follows:
Step 1: construct a three-dimensional point cloud map of the real environment;
Step 2: generate a triangle-mesh map model from the three-dimensional point cloud map with a triangulation algorithm;
Step 3: import the triangle-mesh map model into a 3D rendering engine, and attach a collider for collision detection;
Step 4: from the given GPS longitude, latitude, and elevation of the camera, (Lung_cam, Lat_cam, Z_cam)^T, and of the simulated military target, (Lung_tar, Lat_tar, Z_tar)^T, compute the target's coordinates (X_u, Y_u, Z_u) in the 3D rendering engine's world coordinate system, and add the target to the three-dimensional rendering scene;
Step 5: read the video image captured by the current camera into the 3D rendering engine, and set it as the background;
Step 6: compute, in real time, the camera's six-degree-of-freedom pose relative to the real scene;
Step 7: assign the six-degree-of-freedom pose computed in step 6 to the virtual camera in the 3D rendering engine;
Step 8: connect the display device to the 3D rendering engine's interface and transmit the final rendered virtual-real fusion frame to the display device, completing rendering and display of the virtual-real fusion scene.
Further, in step 4 of the present invention the coordinates (X_u, Y_u, Z_u) are computed according to formula (1), where Lung is the longitude conversion coefficient between the two GPS coordinates (those of the real camera and of the simulated military target), and Lat is the latitude conversion coefficient between the same two GPS coordinates.
Further, step 1 of the present invention constructs the three-dimensional point cloud map with a semi-dense visual SLAM method, and step 6 computes the six-degree-of-freedom pose with the same semi-dense visual SLAM method.
Beneficial effects
Compared with the prior art, the method of the present invention computes the simulated military target's coordinates in the 3D rendering engine's world coordinate system and adds the target inside the rendering engine, achieving fused display of, and real-time interaction between, virtual military targets and the real scene, which effectively addresses the fidelity problem in military simulation training.
Brief description of the drawings
Fig. 1 is a schematic flow diagram of the system in an embodiment of the present invention;
Fig. 2 shows the transformation relations between the coordinate systems.
Detailed description of the embodiments
The embodiments of the method of the present invention are described in detail below with reference to the accompanying drawings.
A method for generating outdoor military simulation training scenes with augmented reality, whose specific steps, as shown in Fig. 1, are:
Step 1: construct and store a semi-dense three-dimensional point cloud map of the real environment with the LSD-SLAM (Large-Scale Direct Simultaneous Localization and Mapping) system.
Step 2: generate a triangle-mesh map model from the semi-dense three-dimensional point cloud map obtained in step 1 with the greedy projection triangulation algorithm in the PCL (Point Cloud Library).
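The patent names PCL's greedy projection triangulation for this step. As a hedged illustration only, the Python sketch below meshes a saved point cloud with Open3D's ball-pivoting reconstruction, a different surface-reconstruction algorithm standing in for greedy projection (which has no direct Open3D equivalent); the file names and radii are assumptions.

```python
# Hedged sketch: mesh a saved semi-dense SLAM point cloud for import into a
# rendering engine. Ball-pivoting stands in for PCL's greedy projection.
import open3d as o3d

pcd = o3d.io.read_point_cloud("lsd_slam_map.ply")  # stored semi-dense map (assumed file)
pcd.estimate_normals(
    search_param=o3d.geometry.KDTreeSearchParamHybrid(radius=0.1, max_nn=30))

radii = o3d.utility.DoubleVector([0.05, 0.1, 0.2])  # ball radii, scene-dependent
mesh = o3d.geometry.TriangleMesh.create_from_point_cloud_ball_pivoting(pcd, radii)

# Export as OBJ so the mesh can be imported or dragged into Unity3D (step 3).
o3d.io.write_triangle_mesh("map_mesh.obj", mesh)
```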
Step 3: import the triangle-mesh map model obtained in step 2 into the Unity3D rendering engine, either through Unity3D's import function or by dragging the model file directly into Unity3D, and add a collider component to the model in Unity3D.
Step 4: according to the configured GPS latitude and longitude of the simulated military target, render the corresponding target at the matching position in the Unity3D virtual scene. The specific steps are as follows:
(1) Define the coordinate systems used to place the simulated military target in the Unity3D scene. As shown in Fig. 2, these are the real-scene coordinate system, the GPS coordinate system, the Unity3D coordinate system, and the model's local coordinate system.
1. GPS coordinate system: the simulated military target has coordinates (Lung_tar, Lat_tar, Z_tar)^T and the camera has coordinates (Lung_cam, Lat_cam, Z_cam)^T; longitude and latitude are in degrees, elevation in meters.
2. Real-scene coordinate system: the camera's initial position in the real world is taken as the world origin, with axes aligned to those of the GPS coordinate system; the target's coordinates in this world system are (X_w, Y_w, Z_w)^T, in meters.
3. Unity3D coordinate system: the target's coordinates in the Unity3D system are (X_u, Y_u, Z_u)^T, in meters.
4. Model local coordinate system: the target's coordinates in its own system are (X_self, Y_self, Z_self)^T, in meters.
(2) Compute the longitude and latitude conversion coefficients from the point-to-point latitude/longitude distance formula; they are used to express the simulated military target in the real-scene coordinate system. Let two points have real-scene longitudes and latitudes (Lung1, Lat1) and (Lung2, Lat2); the longitude conversion coefficient is then obtained from formula (1).
In formula (1), arcsin(·) is the arcsine function; longitude and latitude must be converted from degrees to radians; L is the Earth's radius in kilometers, represented by the constant 6378.173.
The latitude conversion coefficient is obtained from formula (2).
(3) Transform the simulated military target's own longitude, latitude, and elevation into the world coordinate system through the rotation and translation of formula (3), so that the camera and the target are described in the same coordinate system.
In formula (3), R is a 3×3 orthonormal rotation matrix and T a 3×1 translation matrix. In formula (4), the X and Y components of T are obtained by multiplying the longitude and latitude conversion coefficients of formulas (1) and (2) by the differences in longitude and latitude between the target and the real-scene coordinate origin, and the Z component is the elevation difference between the target and the real-scene coordinate origin.
(4) Transform the simulated military target from the real-scene coordinate system into the Unity3D coordinate system with formula (5), so that the target model can be loaded and rendered in the Unity3D engine.
When the model's orientation is not considered (i.e., R is the identity matrix) and the target's coordinates in its own coordinate system are set to (X_self, Y_self, Z_self)^T = (0, 0, 0)^T, combining formulas (1)-(5) yields the target's coordinates (X_u, Y_u, Z_u)^T in the Unity3D world coordinate system.
Finally, load the corresponding military target model and render it at the computed position in the Unity3D scene.
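Formulas (1)-(5) appear only as images in the original document. Consistent with the description above (an arcsine-based point-to-point distance with Earth radius L), the standard haversine distance is assumed here:

d = 2L · arcsin( sqrt( sin²((Lat2 − Lat1)/2) + cos(Lat1) · cos(Lat2) · sin²((Lung2 − Lung1)/2) ) )

The Python sketch below follows the textual description of sub-steps (1)-(4): conversion coefficients from haversine distances, real-scene offsets as coefficient times coordinate difference, elevation difference for the vertical component, and an assumed ENU-to-Unity3D axis mapping (east to x, up to y, north to z) that the patent does not specify.

```python
import math

L_KM = 6378.173  # Earth radius constant named in the description, in kilometers

def haversine_m(lng1, lat1, lng2, lat2):
    """Great-circle distance in meters between two (longitude, latitude) points."""
    lng1, lat1, lng2, lat2 = map(math.radians, (lng1, lat1, lng2, lat2))
    a = (math.sin((lat2 - lat1) / 2) ** 2
         + math.cos(lat1) * math.cos(lat2) * math.sin((lng2 - lng1) / 2) ** 2)
    return 2 * L_KM * 1000 * math.asin(math.sqrt(a))

def gps_to_unity(cam, tar):
    """cam, tar: (longitude_deg, latitude_deg, elevation_m). Returns (Xu, Yu, Zu)."""
    lng_c, lat_c, z_c = cam
    lng_t, lat_t, z_t = tar
    # Conversion coefficients in meters per degree, per the text of formulas (1)-(2).
    lung = haversine_m(lng_c, lat_c, lng_t, lat_c) / abs(lng_t - lng_c)
    lat = haversine_m(lng_c, lat_c, lng_c, lat_t) / abs(lat_t - lat_c)
    # Real-scene coordinates (formulas (3)-(4)) with R = identity and the model's
    # own coordinates set to (0, 0, 0), as in the text.
    east, north, up = lung * (lng_t - lng_c), lat * (lat_t - lat_c), z_t - z_c
    # Assumed ENU-to-Unity3D (left-handed, y-up) axis mapping for formula (5).
    return east, up, north

print(gps_to_unity((116.300, 39.900, 50.0), (116.310, 39.905, 60.0)))
```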
Step 5: read the video image captured by the current camera into the Unity3D rendering engine and set it as the background. The specific steps are as follows:
(1) Register the intrinsic parameters of the real and virtual cameras so that both observe an identical scene picture: according to the model parameters of the camera in use, set the corresponding intrinsic parameters of the virtual camera in the Unity3D scene to the same values.
(2) Transfer the real camera's video image into Unity3D with Unity3D's WebCamTexture() function or the VideoCapture() function of the OpenCV for Unity plug-in, then render the video image as a texture onto a preset plane. In the virtual camera's view, the real-scene video stream can then be observed fused with the simulated military target models, achieving real-time synchronization between the pictures observed by the virtual and real cameras.
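The same video-as-background idea can be illustrated outside Unity3D. The rough Python/OpenCV stand-in below uses a live camera frame as the background and blends a rendered virtual layer (here just a placeholder rectangle with an alpha channel) on top; it shows the fusion principle, not the patent's Unity3D pipeline.

```python
import cv2
import numpy as np

cap = cv2.VideoCapture(0)  # real camera
while True:
    ok, frame = cap.read()
    if not ok:
        break
    # Placeholder for the engine's render target: a BGRA layer whose alpha
    # channel marks where virtual content was drawn.
    virtual = np.zeros((*frame.shape[:2], 4), dtype=np.uint8)
    cv2.rectangle(virtual, (200, 150), (400, 300), (0, 0, 255, 255), -1)

    # Alpha-blend the virtual layer over the camera background.
    alpha = virtual[:, :, 3:4].astype(np.float32) / 255.0
    fused = (frame * (1 - alpha) + virtual[:, :, :3] * alpha).astype(np.uint8)
    cv2.imshow("virtual-real fusion", fused)
    if cv2.waitKey(1) == 27:  # Esc quits
        break
cap.release()
cv2.destroyAllWindows()
```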
Step 6: compute, in real time, the camera's six-degree-of-freedom pose relative to the real scene with the LSD-SLAM algorithm. The specific steps are as follows:
(1) Match feature points between the first frame captured live by the camera and the most similar keyframe saved while building the map in step 1, and compute the pose of the tracking camera at its initial position relative to the map-model coordinate system. Then correct the translation of this pose by the ratio between the feature-point depths in the saved keyframe depth map and the depths recovered by triangulating the matched correspondences (a sketch of this registration follows the step list).
(2) Call the front-end tracking module and the back-end graph optimization module of the LSD-SLAM algorithm to obtain the current frame's six-degree-of-freedom pose relative to its reference frame (the keyframe nearest the current frame), together with the optimized poses between all keyframes. By propagating these poses, the camera's globally consistent real-time six-degree-of-freedom pose relative to the tracking camera's initial position is obtained, at the LSD-SLAM system's scale.
(3) Place manual markers at known locations in the real scene and register them by hand-clicking the markers' feature points against the corresponding 3D points in the LSD-SLAM map. Taking the LSD-SLAM coordinate system as reference, the 3D coordinates of the picked marker feature points at real-scene scale are obtained from a position sensor, while the corresponding 3D point coordinates come from the map built by the running SLAM system; the ratio of the two 3D coordinates gives the scale between the SLAM system and the real scene. Multiplying the translation data by this scale yields the camera's real-time six-degree-of-freedom pose relative to the real scene (see the scale sketch after this list).
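A rough sketch of the initial registration in sub-step (1), under stated assumptions: ORB feature matching is used for concreteness even though LSD-SLAM itself is direct (featureless), the intrinsics and file names are invented, and the keyframe's depth map is assumed available per pixel.

```python
import cv2
import numpy as np

K = np.array([[525.0, 0, 320.0], [0, 525.0, 240.0], [0, 0, 1]])  # assumed intrinsics

kf_img = cv2.imread("keyframe.png", cv2.IMREAD_GRAYSCALE)
kf_depth = np.load("keyframe_depth.npy")  # keyframe depth map (assumed file)
live = cv2.imread("first_frame.png", cv2.IMREAD_GRAYSCALE)

orb = cv2.ORB_create()
kp1, des1 = orb.detectAndCompute(kf_img, None)
kp2, des2 = orb.detectAndCompute(live, None)
matches = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True).match(des1, des2)

obj_pts, img_pts = [], []
for m in matches:
    u, v = map(int, kp1[m.queryIdx].pt)
    z = kf_depth[v, u]
    if z <= 0:  # skip pixels without a valid depth estimate
        continue
    # Back-project the keyframe pixel into 3D with its stored depth.
    obj_pts.append([(u - K[0, 2]) * z / K[0, 0], (v - K[1, 2]) * z / K[1, 1], z])
    img_pts.append(kp2[m.trainIdx].pt)

# rvec/tvec: initial pose of the live camera relative to the map frame.
ok, rvec, tvec, inliers = cv2.solvePnPRansac(
    np.float32(obj_pts), np.float32(img_pts), K, None)
```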
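And a minimal sketch of the scale registration in sub-step (3): the same marker point expressed in both frames gives the SLAM-to-metric scale, which is then applied to the SLAM translation. The ratio of vector norms is one reading of the "ratio of the two 3D coordinates" in the text; all sample values are illustrative.

```python
import numpy as np

p_real = np.array([4.2, 0.8, 9.6])   # marker in real-scene coordinates (meters)
p_slam = np.array([1.05, 0.2, 2.4])  # same marker in the SLAM map frame

scale = np.linalg.norm(p_real) / np.linalg.norm(p_slam)  # SLAM units to meters

t_slam = np.array([0.31, -0.02, 0.88])  # per-frame SLAM translation
t_metric = scale * t_slam               # translation at real-scene scale
print(scale, t_metric)
```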
Step 7: assign the camera's real-time six-degree-of-freedom pose relative to the real scene, computed in step 6, to the virtual camera in the Unity3D rendering engine, synchronizing the virtual camera's pose with the real camera's. The specific steps are as follows:
(1) Determine the virtual camera's initial attitude. From step 4, the camera's initial position is the coordinate origin, but the camera's attitude in the real-scene world coordinate system cannot be determined from that alone. An inertial measurement unit therefore provides the camera's initial attitude data (heading, pitch, and roll angles), which are passed into Unity3D over a serial port and assigned to the virtual camera by a script (a sketch follows this step list).
(2) Transmit the pose data computed in real time by the LSD-SLAM algorithm into Unity3D through a dynamic link library. After the coordinate-system conversion of formulas (7) and (8), assign the pose data to the virtual camera by script, thereby synchronizing the virtual camera's pose with the real camera's in real time.
R_cam = R_ori (T_ul R_slam T_ul)    (7)
T_cam = R_ori (T_ul T_slam)    (8)
In formulas (7)-(8), T_ul is the matrix that converts R_slam and T_slam from the right-handed coordinate system into the left-handed coordinate system; R_ori is the 3×3 orthonormal rotation matrix converted from the camera's initial attitude data obtained in step 7 (1); R_slam and T_slam are, respectively, the 3×3 orthonormal rotation matrix and 3×1 translation matrix of the six-degree-of-freedom pose computed by the LSD-SLAM algorithm; R_cam and T_cam are the 3×3 orthonormal rotation matrix and 3×1 translation matrix assigned to the virtual camera after the coordinate-system conversion of R_slam and T_slam.
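Two sketches for step 7. First, sub-step (1): reading the initial attitude (heading, pitch, roll) from an IMU over a serial port and converting it to R_ori. The "$ATT,heading,pitch,roll" line format, port name, baud rate, and Euler convention (intrinsic z-y-x) are all assumptions; the patent only says the data arrive over a serial link and are assigned by script.

```python
import serial  # pyserial
from scipy.spatial.transform import Rotation

# Read one attitude line from the IMU; the message format is invented.
with serial.Serial("/dev/ttyUSB0", 115200, timeout=1) as port:
    line = port.readline().decode(errors="ignore").strip()

_, heading, pitch, roll = line.split(",")
R_ori = Rotation.from_euler(
    "ZYX", [float(heading), float(pitch), float(roll)], degrees=True).as_matrix()
print(R_ori)
```

Second, the hand-off of formulas (7)-(8). T_ul converts the right-handed SLAM frame into Unity3D's left-handed frame; diag(1, 1, -1), i.e. flipping the z axis, is an assumed choice of T_ul, since the patent does not spell out its entries.

```python
import numpy as np

T_ul = np.diag([1.0, 1.0, -1.0])  # assumed right- to left-handed conversion

def to_virtual_camera(R_ori, R_slam, t_slam):
    """Pose handed to the Unity3D virtual camera, per formulas (7)-(8)."""
    R_cam = R_ori @ (T_ul @ R_slam @ T_ul)  # formula (7)
    t_cam = R_ori @ (T_ul @ t_slam)         # formula (8)
    return R_cam, t_cam

R_cam, t_cam = to_virtual_camera(np.eye(3), np.eye(3), np.array([0.1, 0.0, 0.5]))
print(R_cam, t_cam)
```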
Step 8: connect the display device to the 3D rendering engine's interface and transmit the final rendered virtual-real fusion frame to the display device, completing rendering and display of the virtual-real fusion scene.
With this, the generation method for outdoor augmented reality military simulation training scenes is complete.
In conclusion, the above are merely preferred embodiments of the present invention and are not intended to limit its scope of protection. Any modification, equivalent replacement, improvement, and the like made within the spirit and principles of the present invention shall be included within the scope of protection of the present invention.

Claims (3)

1. A method for generating outdoor military simulation training scenes with augmented reality, characterized in that the process is as follows:
Step 1: construct a three-dimensional point cloud map of the real environment;
Step 2: generate a triangle-mesh map model from the three-dimensional point cloud map with a triangulation algorithm;
Step 3: import the triangle-mesh map model into a 3D rendering engine, and attach a collider for collision detection;
Step 4: from the given GPS longitude, latitude, and elevation of the camera and the GPS longitude, latitude, and elevation of the simulated military target, compute the target's coordinates (X_u, Y_u, Z_u) in the 3D rendering engine's world coordinate system, and add the target to the three-dimensional rendering scene;
Step 5: read the video image captured by the current camera into the 3D rendering engine, and set it as the background;
Step 6: compute, in real time, the camera's six-degree-of-freedom pose relative to the real scene;
Step 7: assign the six-degree-of-freedom pose computed in step 6 to the virtual camera in the 3D rendering engine;
Step 8: connect the display device to the 3D rendering engine's interface and transmit the final rendered virtual-real fusion frame to the display device, completing rendering and display of the virtual-real fusion scene.
2. The method for generating outdoor augmented reality military simulation training scenes according to claim 1, characterized in that step 4 computes the coordinates (X_u, Y_u, Z_u) according to formula (1),
where Lung is the longitude conversion coefficient between the two GPS coordinates, Lat is the latitude conversion coefficient between the two GPS coordinates, (Lung_cam, Lat_cam, Z_cam)^T is the camera's GPS longitude, latitude, and elevation, and (Lung_tar, Lat_tar, Z_tar)^T is the simulated military target's GPS longitude, latitude, and elevation.
3. The method for generating outdoor augmented reality military simulation training scenes according to claim 1, characterized in that step 1 constructs the three-dimensional point cloud map with a semi-dense visual SLAM method, and step 6 computes the six-degree-of-freedom pose with the semi-dense visual SLAM method.
CN201811256948.8A 2018-10-26 2018-10-26 Method for outdoor augmented reality military simulation training Pending CN109377560A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201811256948.8A CN109377560A (en) 2018-10-26 2018-10-26 Method for outdoor augmented reality military simulation training

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201811256948.8A CN109377560A (en) 2018-10-26 2018-10-26 Method for outdoor augmented reality military simulation training

Publications (1)

Publication Number Publication Date
CN109377560A true CN109377560A (en) 2019-02-22

Family

ID=65389982

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201811256948.8A Pending CN109377560A (en) 2018-10-26 2018-10-26 Method for outdoor augmented reality military simulation training

Country Status (1)

Country Link
CN (1) CN109377560A (en)


Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103150658A (en) * 2013-04-02 2013-06-12 武汉友睿科技有限公司 Augmented reality (AR) customizing system and method facing to terminal user
US20180247456A1 (en) * 2017-02-27 2018-08-30 Hiscene (Shanghai) Information Technology Co., Ltd. Method and device for augmented reality display of real physical model
CN108648270A (en) * 2018-05-12 2018-10-12 西北工业大学 Unmanned plane real-time three-dimensional scene reconstruction method based on EG-SLAM

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
QU Yi: "Research on Tracking and Registration Technology for Augmented Reality Geographic Information Systems" (增强现实地理信息系统跟踪注册技术研究), China Master's Theses Full-text Database, Basic Sciences *

Cited By (15)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109887103A (en) * 2019-03-21 2019-06-14 威创集团股份有限公司 A kind of Unity3D positioning points distributing method, device, equipment and computer readable storage medium
CN110021210A (en) * 2019-03-26 2019-07-16 江苏航空职业技术学院 A kind of unmanned plane VR training method with scalability Virtual Space
CN110321000A (en) * 2019-04-25 2019-10-11 南开大学 A kind of dummy emulation system towards intelligence system complex task
CN110321000B (en) * 2019-04-25 2022-12-23 南开大学 Virtual simulation system for complex tasks of intelligent system
CN110111428A (en) * 2019-05-28 2019-08-09 艾瑞迈迪科技石家庄有限公司 A kind of virtual target scaling method and device applied to augmented reality
CN110111428B (en) * 2019-05-28 2023-06-20 艾瑞迈迪科技石家庄有限公司 Virtual target calibration method and device applied to augmented reality
CN110262283B (en) * 2019-06-11 2022-08-23 远形时空科技(北京)有限公司 Multi-scene visual robot simulation platform and method
CN110262283A (en) * 2019-06-11 2019-09-20 远形时空科技(北京)有限公司 A kind of the vision robot's emulation platform and method of more scenes
CN110910484A (en) * 2019-12-03 2020-03-24 上海世长信息科技有限公司 SLAM-based object mapping method from two-dimensional image to three-dimensional real scene
CN111696216A (en) * 2020-06-16 2020-09-22 浙江大华技术股份有限公司 Three-dimensional augmented reality panorama fusion method and system
CN111696216B (en) * 2020-06-16 2023-10-03 浙江大华技术股份有限公司 Three-dimensional augmented reality panorama fusion method and system
CN113961068A (en) * 2021-09-29 2022-01-21 北京理工大学 Close-distance real object eye movement interaction method based on augmented reality helmet
CN113961068B (en) * 2021-09-29 2023-01-06 北京理工大学 Close-range real object eye movement interaction method based on augmented reality helmet
CN114419293A (en) * 2022-01-26 2022-04-29 广州鼎飞航空科技有限公司 Augmented reality data processing method, device and equipment
CN115082648A (en) * 2022-08-23 2022-09-20 海看网络科技(山东)股份有限公司 AR scene arrangement method and system based on marker model binding

Similar Documents

Publication Publication Date Title
CN109377560A (en) A kind of method of Outdoor Augmented Reality military simulation-based training
US9892563B2 (en) System and method for generating a mixed reality environment
CN109359405B (en) Air-space-ground integrated big data battlefield environment semi-physical simulation system
US20190200003A1 (en) System and method for 3d space-dimension based image processing
KR101229283B1 (en) Method and system for visualising virtual three-dimensional objects
CN102735100B (en) Individual light weapon shooting training method and system by using augmented reality technology
JP4804256B2 (en) Information processing method
CN105212418A Development of an augmented reality intelligent helmet with infrared night-vision function
CN108170297B (en) Real-time six-degree-of-freedom VR/AR/MR device positioning method
CN105617658A (en) Multiplayer moving shooting game system based on real indoor environment
CN109636916A Dynamically calibrated wide-area virtual reality roaming system and method
WO2013111146A2 (en) System and method of providing virtual human on human combat training operations
CN106408515A (en) Augmented reality-based vision synthesis system
WO2017065348A1 (en) Collaboration method using head mounted display
CN109725733A (en) Human-computer interaction method and human-computer interaction equipment based on augmented reality
Oskiper et al. Augmented reality binoculars
CN109166181A Hybrid motion capture system based on deep learning
CN106251282A Method and device for generating a simulated map of a robotic arm sampling environment
CN108109460A Chemical plant teaching and visiting equipment with augmented reality
Azuma et al. Performance analysis of an outdoor augmented reality tracking system that relies upon a few mobile beacons
US9646417B1 (en) Augmented reality system for field training
Zhu et al. AR-Weapon: live augmented reality based first-person shooting system
CN116661334B (en) Missile tracking target semi-physical simulation platform verification method based on CCD camera
CN112927356A (en) Three-dimensional display method for unmanned aerial vehicle image
CN116558360A (en) Shooting simulation training method and system based on moving carrier

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
WD01 Invention patent application deemed withdrawn after publication

Application publication date: 20190222