CN110365666A - Multiterminal fusion collaboration command system of the military field based on augmented reality - Google Patents

Multiterminal fusion collaboration command system of the military field based on augmented reality Download PDF

Info

Publication number
CN110365666A
CN110365666A
Authority
CN
China
Prior art keywords
target
battlefield
collaboration
people
virtual
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201910586319.XA
Other languages
Chinese (zh)
Other versions
CN110365666B (en)
Inventor
栾明君
洪岩
栾凯
卞强
宁阳
陈艳
孟德地
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
CETC 15 Research Institute
Original Assignee
CETC 15 Research Institute
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by CETC 15 Research Institute filed Critical CETC 15 Research Institute
Priority to CN201910586319.XA priority Critical patent/CN110365666B/en
Publication of CN110365666A publication Critical patent/CN110365666A/en
Application granted granted Critical
Publication of CN110365666B publication Critical patent/CN110365666B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/017Gesture based interaction, e.g. based on a set of recognized hand gestures
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T19/00Manipulating 3D models or images for computer graphics
    • G06T19/006Mixed reality
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04LTRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L67/00Network arrangements or protocols for supporting network services or applications
    • H04L67/01Protocols
    • H04L67/02Protocols based on web technology, e.g. hypertext transfer protocol [HTTP]
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04LTRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L67/00Network arrangements or protocols for supporting network services or applications
    • H04L67/01Protocols
    • H04L67/131Protocols for games, networked simulations or virtual reality
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/30Image reproducers
    • H04N13/332Displays for viewing with the aid of special glasses or head-mounted displays [HMD]
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/30Image reproducers
    • H04N13/398Synchronisation thereof; Control thereof
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F2203/00Indexing scheme relating to G06F3/00 - G06F3/048
    • G06F2203/01Indexing scheme relating to G06F3/01
    • G06F2203/012Walk-in-place systems for allowing a user to walk in a virtual environment while constraining him to a given position in the physical environment

Abstract

The invention discloses a multi-terminal fusion collaborative command system for the military field based on augmented reality, comprising ten parts: user authority management, battlefield information source access, command-and-control (C2) system integration framework, external content loading, virtual battlefield space modeling, virtual battlefield space display, collaborative interaction common support, person-portrait collaboration, person-target collaboration, and person-target interaction. Aimed at the problems of the traditional remote command-coordination mode in the existing military field, such as low efficiency, poor real-time performance, inability to exchange intuitively, and heavy dependence on mouse and keyboard, the system provides a command environment for commanders at all levels. It ensures that multiple commanders located in different places can, within the same virtual battlefield space and in combination with access to diverse battlefield information resources, carry out command activities such as situation plotting, simulation deduction, intelligence aggregation, and key-intelligence assessment through efficient person-to-person collaboration and person-to-target interaction, greatly improving collaborative command efficiency.

Description

Multiterminal fusion collaboration command system of the military field based on augmented reality
Technical field
The present invention relates to the field of computer software, and more particularly to a multi-terminal fusion collaborative command system for the military field based on augmented reality.
Background technique
Augmented reality (AR) is a new technology that "seamlessly" integrates real-world information with virtual-world information. Entity information that is otherwise difficult to experience within a certain time and space range of the real world (visual information, sound, etc.) is superimposed through computer simulation, and the virtual information is applied to the real world and perceived by human senses, thereby achieving a sensory experience beyond reality. Augmented reality places the real environment and virtual objects into the same picture or space in real time and displays them simultaneously, with the two kinds of information complementing and superimposing each other. Augmented reality not only presents real-world information but also displays virtual information generated by the computer; the two kinds of information complement each other, are superimposed into one picture, and are seamlessly integrated into one space.
At present, our military's existing collaborative command modes are limited by time, space, and distance. Command work between command centers at all levels, such as integrated command and joint control, is carried out through traditional collaborative means such as electronic whiteboards, audio/video conferencing, and collaborative plotting based on geographic information systems. Problems remain: collaborative processes are difficult to conduct in real time, and commanders' perception of the battlefield situation relies on mouse and keyboard, with poor interactivity.
In terms of augmented reality devices, enterprises at home and abroad represented by Microsoft, Magic Leap, Intel RealSense, Hewlett-Packard, HTC, etc. have successively launched glasses devices. Cameras on these devices can recognize user gestures and thereby provide gesture-based scene interaction, and positioning of targets is realized based on gaze (focus) tracking technology.
In terms of portrait fusion, manufacturers represented by Occipital Structure and "owl vision" provide solutions for projecting portraits into virtual scenes, but at present most can only achieve flat images; few manufacturers can realize lifelike holographic three-dimensional live-person portrait projection (holographic three-dimensional live-person portrait projection significantly improves the experience of multi-user collaboration).
In terms of target positioning, most solutions currently on the market rely on gaze (focus) tracking for positioning (e.g., Microsoft HoloLens). This positioning method is inflexible to use, places high demands on head position and tilt angle, and causes strong fatigue over long operating times, making it unsuitable for complex business scenarios. In terms of gesture and voice interaction, the current technical difficulty lies mainly in the low accuracy of user gesture recognition; most solutions can only recognize a small number of simple gestures (e.g., Microsoft HoloLens can only recognize two simple gestures).
In terms of multi-user collaboration, most research focuses on how to achieve clear portrait projection in augmented reality scenes, while multi-user collaborative interaction within the scene remains a blank field, with no mature solution at present.
Judging from the mainstream augmented reality interaction solutions currently on the market, and from published papers and patents on augmented reality interaction, there is still a certain distance from the demands of distributed collaborative command in the military field. Specifically: the application in the field of operational collaborative command places particular emphasis on combination with the battlefield environment, including panoramic display of the battlefield environment, rapid positioning and control of enemy and friendly targets, fast switching of battlefield scenes, rapid information transfer between commanders at all levels, precise grasp of the expressions and actions of commanders at all levels, real-time access to all kinds of battlefield information sources with effective display combined with command actions, and various operations such as situation plotting and operational assessment based on the battlefield environment. Strict requirements are imposed on indices such as response speed, target positioning accuracy, and real-time information transfer, and current augmented-reality-based applications cannot meet the requirements of collaborative command in the military field.
Summary of the invention
In view of the deficiencies of the prior art, the present invention proposes a multi-terminal fusion collaborative command system based on augmented reality, focused on core collaborative-interaction demands such as integrated command and joint control. It provides a command environment for commanders at all levels, ensuring that multiple commanders located in different places can, within the same virtual battlefield space and in combination with access to diverse battlefield information resources, carry out command activities such as situation plotting, simulation deduction, intelligence aggregation, and key-intelligence assessment through efficient person-to-person collaboration and person-to-target interaction, greatly improving collaborative command efficiency.
In order to solve the above technical problem, the present invention adopts the following technical solution: a multi-terminal fusion collaborative command system for the military field based on augmented reality, comprising ten parts: user authority management, battlefield information source access, C2-system integration framework, external content loading, virtual battlefield space modeling, virtual battlefield space display, collaborative interaction common support, person-portrait collaboration, person-target collaboration, and person-target interaction.
The user authority management includes room management, personnel management, rights management, and collaboration invitation functions. Personnel management maintains the information of the users participating in collaboration who are to join the virtual scene; room management allows an initiator to create a collaboration space in a virtual scene; collaboration invitation allows the initiator to select personnel to join the collaboration space; rights management sets each person's collaboration permissions in the collaboration space. Before accessing the virtual battlefield environment, a commander must log in and authenticate by fingerprint or password; only verified users can enter the virtual battlefield space and use the related functions.
The battlefield information source access provides, in an active or passive manner, support for accessing diverse land, sea, air, and space information resources under fixed and tactical environments. After real-time acquisition, the information can be processed and analyzed by information source, converted into battlefield situation information supporting collaborative command and bound to targets, or displayed directly in a consolidated information-source window.
The C2-system integration framework provides integration of command-and-control systems, supporting the integration of Web-class C2 applications and RPC, gRPC, WebService, and RESTful C2 function services; combined with the person-target interaction function, it enables commanders to operate C2 systems in the virtual battlefield space with system feedback results updated in real time.
The external content loading refers to opening a content window at a designated position in the virtual battlefield space, periodically polling files in local storage, loading and refreshing them in the content window of the virtual battlefield environment, and realizing page-turning and click operations on the content in combination with the person-target interaction capability; it provides loading and display of content in multiple formats and also supports loading and display of video streams.
The virtual battlefield space modeling refers to combining geographic information systems with modeling technology: models of all kinds of targets, terrain, vegetation, and trees are built through a modeling process, and after loading and running on augmented reality devices, a global virtual battlefield environment is generated that realizes a lifelike restoration of the battlefield environment, allowing the battlefield environment and the states of enemy and friendly targets to be checked from different angles.
The virtual battlefield space display provides multiple browsing modes, all of whose feedback is presented to commanders through augmented reality devices, facilitating viewing of the battlefield space from different perspectives.
The collaborative interaction common support provides clock synchronization, coordinate calculation, instruction collaboration, and state collaboration functions. Clock synchronization provides a unified system time for content collaboration and collaboration between users; coordinate calculation provides basic algorithmic support for positioning holographic three-dimensional portraits in the virtual space; instruction collaboration supports distributing operation instructions converted from user gestures to each glasses device; state collaboration supports distributing target-state changes in the virtual scene to each glasses device.
The person-portrait collaboration provides character-feature extraction, multi-channel portrait fusion, voice noise reduction, and audio-video synchronization functions. Character-feature extraction acquires portraits from multiple angles via a group of depth cameras; multi-channel portrait fusion extracts the relevant video parameters from portraits acquired at different angles and forms complete holographic three-dimensional portrait parameters by splicing and combining; voice noise reduction performs noise-reduction processing on voice; audio-video synchronization synchronizes the portrait and voice acquired at the same moment and distributes them to all glasses devices under a unified time reference, ensuring that everyone can see the others and hear their voices at the same time.
The person-target collaboration provides target-state feedback, target-state acquisition, target-state distribution, and target-state synchronization functions. Target-state feedback means that a target model reacts differently according to interactive instructions, driving corresponding changes of the target's position, shape, color, state, and content; target-state acquisition obtains changes of target state and extracts the specific changed parameter values; target-state distribution takes the initiator as the reference and distributes the change of target state to everyone's augmented reality devices; target-state synchronization compares and verifies the target state distributed to each commander's augmented reality device against the local target-state change, and if an inconsistency is found, it is handled by the collaborative client deployed on the augmented reality device, ensuring the consistency of target-state display on all augmented reality devices.
The person-target interaction provides gesture-feature extraction, gesture-motion analysis, interactive-instruction conversion, pointed-target positioning, gesture-instruction acquisition, voice-instruction acquisition, instruction synchronization distribution, and instruction-action callback functions. Gesture-feature extraction perceives, from auxiliary equipment, the direction and relative-position data of the commander's current gesture; gesture-motion analysis extracts from the auxiliary equipment the sequence of gestures over a period of time; interactive-instruction conversion analyzes the gesture sequence, compares it with predetermined gesture actions, and converts it into an operation instruction; pointed-target positioning discriminates the coordinates and position of the target according to the gesture direction and relative-position data, combined with the coordinate system of the augmented reality scene; gesture-instruction acquisition and voice-instruction acquisition mean that when a user issues an instruction by gesture or voice, an instruction interface call is triggered to obtain the instruction converted from the current gesture or speech recognition; instruction synchronization distribution synchronously distributes interactive instructions to other augmented reality devices; the instruction-action callback establishes an interaction channel with the operated target after the instruction is received and triggers the target's callback method so that the target executes the feedback action.
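As a rough illustration of the target-state synchronization described above, the following Python sketch shows one way a collaborative client on each device might compare a distributed target state against its local copy and reconcile inconsistencies, taking the initiator's copy as the reference. All class and function names here are illustrative assumptions, not part of the patent.

```python
import hashlib
import json

def state_digest(state: dict) -> str:
    """Stable digest of a target's state, used to compare copies cheaply."""
    return hashlib.sha256(json.dumps(state, sort_keys=True).encode()).hexdigest()

class CooperativeClient:
    """Per-device client that reconciles distributed target state with local state."""

    def __init__(self):
        self.local_states = {}  # target_id -> state dict

    def apply_local_change(self, target_id: str, state: dict):
        self.local_states[target_id] = state

    def on_state_distribution(self, target_id: str, distributed_state: dict) -> str:
        """Called when the initiator's state change arrives at this device."""
        local = self.local_states.get(target_id)
        if local is None or state_digest(local) != state_digest(distributed_state):
            # Inconsistency found: the distributed copy (the initiator is the
            # reference) overwrites the local copy so all devices stay consistent.
            self.local_states[target_id] = distributed_state
            return "reconciled"
        return "consistent"
```

The digest comparison keeps the consistency check cheap even for large state dictionaries; only on mismatch is the full state replaced.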
In the above scheme, preferably, the collaboration permissions include restrictions on whether a user may access the virtual battlefield space, restrictions on which designated users' images a user can see and collaborate with in the virtual battlefield space, and restrictions on which targets in the virtual battlefield space a user can see and interact with.
In the above scheme, preferably, the battlefield information source access includes connecting all kinds of in-service battlefield information sources, which can be displayed in various ways, such as automatically attaching to associated targets, consolidated display by theme, and freely recalling specific items.
In the above scheme, preferably, the C2-system integration framework provides integration of Web-class C2 applications and RPC, gRPC, WebService, and RESTful C2 function services; commanders can perform operations such as selecting, clicking, double-clicking, dragging, and pulling down on the integrated C2 system in the virtual battlefield space, and the integration framework updates the system's feedback results in real time.
In the above scheme, preferably, the content formats supported by external content loading include external video, pictures, text, PPT, Word, and Excel, and basic page-turning and click operations can be performed on the loaded content.
In the above scheme, preferably, the virtual battlefield space modeling combines geographic information systems with modeling technology to create a global virtual battlefield environment that realizes a lifelike restoration of the actual battlefield environment.
In the above scheme, preferably, the browsing modes provided by the virtual battlefield space display include hawk-eye view, roaming, cutting into and out of specific mission areas, and zooming targets in and out.
In the above scheme, preferably, the person-target interaction realizes perception of the user's target-pointing gesture and, combined with the coordinate-system setting and gesture direction, realizes precise positioning of targets, content display windows, and integrated C2-system windows in the virtual battlefield environment.
In the above scheme, preferably, the person-target interaction establishes an instruction path between gesture/voice instructions and scene targets; on the basis of precise target positioning, user gestures are converted into manipulation instructions and sent to the target selected by the user, triggering the target's feedback action.
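The instruction path just described, from a recognized gesture through instruction conversion to the selected target's callback, could be sketched as follows. The gesture names, instruction vocabulary, and class names are illustrative assumptions only.

```python
from typing import Callable, Dict, List

# Hypothetical gesture-to-instruction mapping table (illustrative names only).
GESTURE_TABLE: Dict[str, str] = {
    "tap": "select",
    "double_tap": "open_menu",
    "drag": "move",
}

class SceneTarget:
    """A scene object that registers callback methods for instructions."""

    def __init__(self, name: str):
        self.name = name
        self.callbacks: Dict[str, Callable[[], str]] = {}
        self.log: List[str] = []

    def on(self, instruction: str, fn: Callable[[], str]):
        self.callbacks[instruction] = fn

    def trigger(self, instruction: str) -> str:
        feedback = self.callbacks[instruction]()  # target executes its feedback action
        self.log.append(feedback)
        return feedback

def dispatch(gesture: str, target: SceneTarget) -> str:
    """Convert a recognized gesture into an instruction and send it down the
    instruction path to the user-selected target."""
    instruction = GESTURE_TABLE[gesture]
    return target.trigger(instruction)
```

In a full system the feedback returned by `trigger` would also be distributed to the other devices so all commanders perceive the same reaction.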
In the above scheme, preferably, the person-target interaction ensures that the target feedback actions triggered by manipulation instructions can be synchronously perceived by all commanders in the virtual battlefield environment.
In the above scheme, preferably, the person-portrait collaboration acquires portraits from multiple angles via a group of depth cameras and extracts the relevant video parameters from the portraits acquired at different angles, forming complete holographic three-dimensional portrait parameters by splicing and combining.
In the above scheme, preferably, the person-portrait collaboration projects the holographic portraits of two or more commanders into the virtual battlefield command environment; everyone can see the complete virtual battlefield command environment through a worn display device, and everyone's facial features, movements, posture, and voice can be synchronously perceived by the others.
In the above scheme, preferably, the person-portrait collaboration completes the synchronous acquisition of the body images and speech audio of all users in the virtual scene under the same time reference, and synchronously distributes them through different transmission channels to the glasses devices and audio playback devices worn by all users in the virtual scene, realizing the effect of synchronized presentation of body-image movements, expressions, and sound.
Aimed at the problems of the traditional remote command-coordination mode in the existing military field, such as low efficiency, poor real-time performance, inability to exchange intuitively, and heavy dependence on mouse and keyboard, the present invention proposes a multi-terminal fusion collaborative command system for the military field based on augmented reality, which has the following features:
(1) Virtual battlefield environment construction based on geographic information
A virtual battlefield model is constructed based on various kinds of geographic-information data, including all kinds of geographic elements such as land and ocean, desert and grassland, islands and reefs, mountains and rivers, and trees and shrubs, as well as the deployment of enemy, friendly, and supporting forces in the battlefield space and all kinds of important targets such as artillery, tanks, vehicles, ships, and aircraft. After commanders at all levels put on augmented reality devices, they can enter the virtual battlefield environment through authentication.
(2) Fast switching of battlefield scenes
When facing the macroscopic battlefield environment, several scene-switching means are provided, such as hawk-eye view, full-map roaming, overlooking the whole battlefield, cutting into specific regions, and focusing on specific targets; at the same time, a geographic-information-system window can be accessed in the virtual battlefield environment, facilitating commanders in checking the battlefield environment through a combination of multiple means.
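The scene-switching means above can be thought of as a small camera state machine; the sketch below shows one minimal way to model them. The mode names and class structure are assumptions for illustration, not the patent's implementation.

```python
from enum import Enum, auto

class BrowseMode(Enum):
    HAWK_EYE = auto()       # overview of the whole battlefield
    ROAM = auto()           # free full-map roaming
    REGION_CUT_IN = auto()  # cut into a specific mission region
    TARGET_FOCUS = auto()   # focus the view on a specific target

class SceneCamera:
    """Minimal scene-camera state machine for fast scene switching."""

    def __init__(self):
        self.mode = BrowseMode.HAWK_EYE
        self.focus = None  # region or target the current mode applies to

    def switch(self, mode: BrowseMode, focus=None):
        self.mode, self.focus = mode, focus
        return (self.mode.name, self.focus)
```

A real system would attach camera-transition animations and GIS-window state to each mode; the state machine only captures which view is active.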
(3) Access to all kinds of battlefield information
This patent supports access to all kinds of battlefield resources; in-service battlefield information sources can be connected and displayed in various ways, such as automatically attaching to associated targets, consolidated display by theme, and freely recalling specific items, which satisfies the command demands of commanders at all levels and greatly improves collaborative command efficiency.
(4) Integration of all kinds of common command functions
This patent supports the integration of all kinds of command functions: operational command functions such as situation plotting, comprehensive analysis, operation preparation, and document transceiving can be integrated based on the virtual battlefield environment; all kinds of text, video, image, and system-interface content such as intelligence digests, battlefield reconnaissance, operational documents, and operational plans can be integrated and displayed; and battlefield image data can be directly accessed and comprehensively displayed in combination with the virtual battlefield environment.
(5) Precise positioning of targets in the virtual battlefield
Auxiliary perception equipment is used to perceive the user's target-pointing gesture and, combined with the coordinate-system setting and gesture direction, to realize precise positioning of pointed targets (such as ships, aircraft, tanks, and encampments), content loaded and displayed in windows (video, text, images, etc.), and all kinds of integrated command-system interfaces.
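One plausible way to turn a perceived pointing gesture into a target selection is to cast a ray from the hand position along the gesture direction in scene coordinates and pick the target closest to that ray within an angular tolerance. The following sketch makes that concrete; the function name and tolerance value are illustrative assumptions.

```python
import math

def point_at_target(origin, direction, targets, max_angle_deg=5.0):
    """Pick the scene target closest to the gesture ray.

    origin/direction: the commander's hand position and pointing vector in
    scene coordinates (as reported by the auxiliary perception device);
    targets: dict of name -> 3D position. Returns the best target within
    the angular tolerance, or None if nothing is pointed at.
    """
    def norm(v):
        m = math.sqrt(sum(c * c for c in v))
        return tuple(c / m for c in v)

    d = norm(direction)
    best, best_angle = None, max_angle_deg
    for name, pos in targets.items():
        to_target = norm(tuple(p - o for p, o in zip(pos, origin)))
        cos_a = max(-1.0, min(1.0, sum(a * b for a, b in zip(d, to_target))))
        angle = math.degrees(math.acos(cos_a))
        if angle <= best_angle:
            best, best_angle = name, angle
    return best
```

Choosing by angular distance rather than screen distance makes the positioning work at any viewing range, which matches the stated goal of pointing at smaller targets over longer distances.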
(6) Interaction with targets in the virtual battlefield environment
An instruction path between gesture/voice instructions and scene targets is established; on the basis of precise target positioning, the user's operations on the auxiliary perception equipment and specific words spoken by the user are converted into manipulation instructions that trigger the target's feedback action, realizing interactions with targets or the virtual scene such as calling up a target menu, selecting a menu item, and clicking, double-clicking, or tapping a specified point. This ensures that users can interact with all kinds of targets in the virtual scene and trigger the targets' own controlled attributes, and that all feedback effects triggered by manipulating targets in the virtual battlefield command environment can be synchronously perceived by the commanders.
(7) Action collaboration of multiple users in the virtual scene
Two or more commanders are supported in projecting holographic portraits into the virtual battlefield command environment; everyone can see the complete virtual battlefield command environment through a worn display device, and everyone's facial features, movements, posture, and voice can be synchronously perceived by the others, with an experience like conducting collaborative command in front of a sand table in the same place.
(8) Voice collaboration of multiple users in the virtual scene
The synchronous acquisition of the body images and speech audio of all users in the virtual scene is completed under the same time reference. After the acquired audio is compressed, filtered, and otherwise processed, it is synchronously distributed, together with the holographic three-dimensional human-body images that have completed coordinate-mapping processing, through different transmission channels to the glasses devices and audio playback devices worn by all users in the virtual scene, realizing the effect of synchronized presentation of body-image movements, expressions, and sound. Each user in the virtual scene can converse with others and hear others' speech, like the face-to-face communication of several people in the real world.
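Because image and audio travel over different transmission channels, playback must re-pair them by their timestamps under the common time reference. The sketch below shows one simple pairing policy, matching each portrait frame with the nearest audio frame and dropping pairs that drift beyond a tolerance; the function name, frame representation, and 20 ms tolerance are assumptions for illustration.

```python
def pair_by_timestamp(video_frames, audio_frames, tolerance_ms=20):
    """Pair portrait frames with audio frames captured under the same time
    reference. Each frame is a (timestamp_ms, payload) tuple; a video frame
    whose nearest audio frame drifts beyond the tolerance is dropped rather
    than presented out of sync."""
    pairs = []
    for vts, vframe in video_frames:
        ats, aframe = min(audio_frames, key=lambda af: abs(af[0] - vts))
        if abs(vts - ats) <= tolerance_ms:
            pairs.append((vts, vframe, aframe))
    return pairs
```

Dropping unmatched frames trades a brief gap for lip-sync correctness, which is usually the right choice for conversational scenes.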
Beneficial effects of the present invention are as follows:
A) An important breakthrough with practical reference value has been achieved in the integrated application of person-object interaction. Combining multiple interactive means such as gesture and voice, commanders at all levels can precisely position, distribute instructions to, and receive interaction feedback from every target in the augmented reality scene (including models, buttons, menus, loaded external documents, videos, system interfaces, etc.), which is applicable to most application scenarios of joint-operation collaborative command;
B) A more accurate target-positioning means is released, an innovation in the field of augmented reality collaborative interaction. This patent proposes adopting an accurate gesture-perception technique realized on the basis of auxiliary perception equipment, supporting pointing and positioning operations within a wider field of view and over longer viewing distances, and precise positioning of smaller-sized targets;
C) Multi-user collaborative interaction based on holographic three-dimensional live-person images fills a blank in the field of augmented reality collaborative applications. On the basis of portrait and sound acquisition from multiple users located in different places, the developed achievement extracts human-body feature data, depth data, color data, and voice data, and after a series of processing steps such as spatial coordinate conversion and positioning, audio-video synchronization, and real-virtual scene fusion, projects them into the glasses devices, so that users in the virtual scene can see each other, hear each other, and exchange and collaborate with each other in the scene;
D) On the basis of the virtual battlefield space, integration of C2 information-system function interfaces, loading and display of external content sources, and integration of the interconnection of multiple communication channels in the operational environment are provided, while commanders are supported in manipulating the integrated content in the virtual battlefield environment. This provides a new C2-system usage pattern for commanders at all levels and greatly alleviates the problem that commanders' perception of the battlefield situation depends on mouse and keyboard with poor interactivity.
Brief Description of the Drawings
Fig. 1 is a design diagram of the virtual-battlefield-based multi-person collaborative interaction model of the present invention.
Fig. 2 is a functional composition diagram of the collaborative interaction system of the present invention.
Fig. 3 is a diagram of the internal relationships and calling logic of the collaborative interaction system of the present invention.
Fig. 4 is an implementation flow chart of the collaborative interaction system of the present invention.
Detailed Description of the Embodiments
The technical solution of the present invention is further described below in two respects, model design and composition structure, with reference to the accompanying drawings.
(1) Model design
The model design of this patent is shown in Fig. 1 and comprises eleven parts: the user gesture model, the virtual battlefield permission model, the user portrait model, virtual-battlefield-portrait fusion, the multi-user collaboration model, the interactive instruction model, the battlefield dynamic object model, the battlefield static object model, the battlefield key-reference model, the battlefield terrain model, and the target feedback content model.
The user gesture model defines user gestures, including gesture shape definition, gesture feature definition, and gesture contour definition; it marks the pointing direction of the user gesture, i.e., it combines the virtual-scene projection reference and the current user position and obtains, through coordinate-conversion analysis, the bearing and direction of the user gesture in the current virtual battlefield environment; and it defines the mapping between gestures and interactive instructions, specifying the matching degree a user gesture must reach before the user is considered to intend an interaction, so as to trigger the defined interactive instruction and dispatch it to the specified target.
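The gesture-to-instruction mapping described above — a gesture triggers an interaction only when its match degree reaches a defined threshold — can be sketched as follows. This is an illustrative sketch only; the feature vectors, threshold values, and instruction names are assumptions, not part of the patent text:

```python
import math

# Hypothetical predefined gesture templates:
# name -> (feature vector, match threshold, mapped interactive instruction)
GESTURE_TEMPLATES = {
    "point": ([1.0, 0.0, 0.2], 0.90, "SELECT_TARGET"),
    "grab":  ([0.3, 0.9, 0.5], 0.85, "OPEN_MENU"),
}

def cosine_similarity(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

def match_gesture(features):
    """Return the mapped instruction if any template's match degree reaches
    its threshold; otherwise None (no interaction intended)."""
    best = None
    for name, (template, threshold, instruction) in GESTURE_TEMPLATES.items():
        score = cosine_similarity(features, template)
        if score >= threshold and (best is None or score > best[0]):
            best = (score, instruction)
    return best[1] if best else None
```

The threshold-per-template design matches the model's notion that each gesture definition carries its own required matching degree before an instruction is triggered.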
The virtual battlefield permission model defines the permission with which a user enters the virtual battlefield space, along multiple dimensions such as command level (Central Military Commission joint command, theater joint command, service, commanding agencies at each level within a service), business domain (command, intelligence, logistics, political work, etc.), post and rank (commander-in-chief, army commander, division commander, regiment commander, company commander, etc.), and the specific personnel selected.
The user portrait model defines the characterization parameters of the holographic three-dimensional portrait, including elements such as portrait contour, facial image features, hand image features, front-body image features, back-body image features, and body-motion capture.
Virtual-battlefield-portrait fusion defines the projection, into the virtual battlefield environment, of the holographic three-dimensional portraits captured from multiple commanders located in different places. Parameters such as each person's standing position relative to the capture camera determine the orientation of the holographic portrait in the virtual battlefield environment, while the projection positions of the portraits are distributed in the virtual battlefield environment according to the number of participants, ensuring that every pair of portraits faces each other.
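One way to distribute projection positions by participant count — placing portraits evenly on a circle, each oriented toward the center so that any two face each other — might look like this sketch; the radius and coordinate conventions are assumptions:

```python
import math

def place_portraits(n, radius=2.0):
    """Distribute n holographic portraits evenly on a circle in the
    horizontal plane and orient each toward the circle center, so that
    any two portraits face each other across the scene."""
    placements = []
    for i in range(n):
        angle = 2 * math.pi * i / n
        x, z = radius * math.cos(angle), radius * math.sin(angle)
        # Facing direction: unit vector from the portrait toward the center.
        facing = (-math.cos(angle), -math.sin(angle))
        placements.append({"position": (x, z), "facing": facing})
    return placements
```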
The multi-user collaboration model defines the mode of collaboration among two or more commanders located in different places: each commander sees the virtual battlefield command environment synchronously, but because their positions differ, the viewing angle on the environment differs as well; each commander can see the holographic portraits of the others and perceive, through the portraits, the changes in the others' current coherent movements, facial features, posture, and position, as if collaborating face to face in the same place; the commanders can hear one another's voices, and each commander's voice is transmitted in synchrony with his current movements, facial features, and posture; the entire human-computer interaction that each commander performs with a target can be seen synchronously by all commanders; and two or more commanders can collaboratively perform serial operations on a target, with the whole interaction seen synchronously by all commanders.
The interactive instruction model defines the modes of interaction between the commander and targets in the virtual scene. The commander can move around the periphery of the virtual battlefield command environment and view it from different angles; as the commander moves away from or toward the environment, its displayed appearance changes with distance; the commander selects a target with an auxiliary device such as a finger ring or ring wand (pointing at the target, whose response indicates that it is selected) and, combined with voice instructions or operations on the auxiliary device, can call up the target's interactive menu, select and click menu items, or perform operations such as selecting, clicking, double-clicking, dragging, and pulling down on the C2-system pages inside a window; in the property-loading window of the virtual battlefield command environment, the commander can see locally loaded videos, plain text, and pictures, and can stop, pause, play, and mute videos; and in the battlefield information-source display window, the commander can see the connected target reconnaissance information, aggregated and processed intelligence, uploaded and issued documents, and other content.
The battlefield dynamic object model defines all models with interaction capability in the virtual battlefield environment, including the model's appearance and style, its rendering color, its lighting effect, the instruction set it can respond to, and its actions after responding to an instruction (including changes in position, movement, and tilt angle; changes in displayed content; changes in color; etc.).
The battlefield static object model defines all models without interaction capability in the virtual battlefield environment, and the battlefield terrain model defines all models related to the battlefield environment; these models all include contents such as the model's appearance and style, rendering color, and lighting effect.
The target feedback content model defines how, after a target receives an interactive instruction and its state is changed, the target model is reconstructed and fed back to the commander.
(2) Composition structure
The system composition structure of this patent is shown in Fig. 2 and specifically comprises ten parts: user permission management, battlefield information-source access, the C2-system integration framework, external content loading, virtual battlefield space modeling, virtual battlefield space display, collaborative-interaction common support, human-portrait collaboration, human-target collaboration, and human-target interaction.
User permission management includes functions such as virtual conference-room management, personnel management, permission management, and collaboration invitation. Personnel management maintains the information of the users who will join the virtual scene and participate in the collaboration; through room management an initiator can create the collaboration space within a virtual scene; through collaboration invitation the initiator can select personnel to join the collaboration space; and through permission management each person's collaboration permissions in the collaboration space can be set, including whose images I can see and communicate with, which objects in the virtual scene I can see, and which objects I can interact with.
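The per-person collaboration permissions just described (whose images I may see, which objects I may see, which objects I may interact with) can be represented as a simple permission table; the class and field names below are illustrative assumptions:

```python
class CollaborationPermissions:
    """Per-user collaboration permissions inside one collaboration space."""

    def __init__(self):
        # user -> sets of allowed peers / visible objects / interactive objects
        self._visible_peers = {}
        self._visible_objects = {}
        self._interactive_objects = {}

    def grant(self, user, peers=(), objects=(), interactive=()):
        self._visible_peers.setdefault(user, set()).update(peers)
        self._visible_objects.setdefault(user, set()).update(objects)
        self._interactive_objects.setdefault(user, set()).update(interactive)

    def can_see_peer(self, user, peer):
        return peer in self._visible_peers.get(user, set())

    def can_interact(self, user, obj):
        # Interaction permission implies visibility of the object.
        return (obj in self._interactive_objects.get(user, set())
                and obj in self._visible_objects.get(user, set()))
```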
Battlefield information-source access provides support for connecting to land, sea, air, and space information sources in both fixed and tactical environments; information obtained in real time can be processed and analyzed and converted into battlefield situation information that supports collaborative command.
The C2-system integration framework provides interface-level integration of C2 systems, supporting the integration of desktop client application interfaces and Web interfaces; combined with the human-target interaction function, it enables commanders in the virtual battlefield space to perform operations on the C2 system such as selecting, clicking, double-clicking, dragging, and pulling down, with system feedback results refreshed in real time.
External content loading refers to opening a property window at a designated position in the virtual battlefield space and loading and displaying external content in multiple formats such as video, pictures, text, PPT, Word, and Excel; loading and display of video streams is also supported.
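The periodic-polling behavior described later in the embodiment (scan a local storage space, pick up supported file types, refresh them in the property window) could be sketched as follows; the directory layout and the supported-extension list are assumptions:

```python
import os

# Hypothetical set of file types the property window can display.
SUPPORTED_EXTS = {".mp4", ".txt", ".png", ".jpg",
                  ".ppt", ".pptx", ".doc", ".docx", ".xls", ".xlsx"}

def poll_content_dir(path, already_loaded):
    """Return the supported files in `path` that have not yet been loaded,
    so the caller can load and refresh them in the property window."""
    new_files = []
    for name in sorted(os.listdir(path)):
        ext = os.path.splitext(name)[1].lower()
        if ext in SUPPORTED_EXTS and name not in already_loaded:
            new_files.append(name)
    return new_files
```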
Virtual battlefield space modeling refers to combining geographic information system (GIS) and modeling technology to achieve a lifelike restoration of the battlefield environment. Compared with existing geographic information systems, it offers a strong sense of immersion and a high degree of real-scene fidelity, and the battlefield environment and the movement states of friendly and enemy targets can be viewed from different angles.
Virtual battlefield space display provides multiple browsing modes such as eagle-eye view, roaming, cutting in/out of a specific mission area, and target magnification/reduction, making it convenient for commanders to view the battlefield space from different angles.
Collaborative-interaction common support provides functions such as clock synchronization, coordinate calculation, instruction collaboration, and state collaboration. Clock synchronization provides timing support for collaboration between users and between users and content; coordinate calculation provides basic algorithm support for positioning holographic three-dimensional portraits in the virtual space; instruction collaboration supports the distribution, to each glasses device, of the operation instructions converted from user gestures; and state collaboration supports the distribution, to each glasses device, of the target states that change in the virtual scene.
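Clock synchronization between glasses devices could use a simple round-trip offset estimate in the style of NTP; the following sketch and its timestamp names are assumptions, not the patent's prescribed method:

```python
def estimate_clock_offset(t_send, t_server_recv, t_server_send, t_client_recv):
    """NTP-style estimate of how far the client clock lags the server clock,
    assuming symmetric network delay. t_send and t_client_recv are client
    timestamps; the other two are server timestamps."""
    offset = ((t_server_recv - t_send) + (t_server_send - t_client_recv)) / 2.0
    round_trip = (t_client_recv - t_send) - (t_server_send - t_server_recv)
    return offset, round_trip
```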
Human-portrait collaboration provides functions such as character-feature extraction, multichannel portrait fusion, voice denoising, and audio-video synchronization. Character-feature extraction captures the portrait from multiple angles with a group of depth cameras; multichannel portrait fusion extracts coherent video parameters such as feature values, color, posture, and figure from the portraits captured at different angles and splices and combines them into complete portrait parameters; voice denoising performs noise reduction on the voice; and audio-video synchronization synchronizes the portrait and voice captured at the same moment and distributes them to all glasses devices under a unified time reference, ensuring that everyone sees the others while also hearing their voices.
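Pairing the portrait frames and voice chunks captured at the same moment under a unified time reference could be sketched as below; the millisecond timestamps and the tolerance value are assumptions:

```python
def sync_av(frames, audio_chunks, tolerance_ms=20):
    """Pair each portrait frame with the audio chunk closest in timestamp,
    dropping frames with no audio within the tolerance.
    `frames` and `audio_chunks` are lists of (timestamp_ms, payload)."""
    paired = []
    for ft, fpayload in frames:
        best = min(audio_chunks, key=lambda a: abs(a[0] - ft), default=None)
        if best is not None and abs(best[0] - ft) <= tolerance_ms:
            paired.append((ft, fpayload, best[1]))
    return paired
```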
Human-target collaboration provides functions such as target-state feedback, target-state acquisition, target-state distribution, and target-state synchronization. Target-state feedback means the target model reacts differently according to the interactive instruction, driving corresponding changes in the target's position, shape, color, state, and content; target-state acquisition obtains the change of target state and extracts the specific changed parameter values; target-state distribution means distributing the change of target state, taking the initiator as the reference, to everyone's augmented-reality devices; and target-state synchronization compares and verifies the target-state change distributed to a commander's augmented-reality device against the local target state and, if an inconsistency is found, has it handled by the collaboration client deployed in the augmented-reality device, ensuring that the displayed target state is consistent across all augmented-reality devices.
Human-target interaction provides functions such as gesture-feature extraction, gesture-motion analysis, interactive-instruction conversion, pointing-target positioning, gesture-instruction acquisition, voice-instruction acquisition, synchronous instruction distribution, and instruction-action callback. Gesture-feature extraction perceives, from the auxiliary device, data such as the direction and relative position of the commander's current gesture; gesture-motion analysis extracts the gesture sequence within a period of time from the auxiliary device; interactive-instruction conversion analyzes the gesture sequence, compares it with predefined gesture motions, and converts it into an operation instruction; pointing-target positioning determines the coordinates and position of the pointed target from data such as the gesture direction and relative position, combined with the coordinate system of the augmented-reality scene; gesture/voice-instruction acquisition means that when the user issues an instruction by gesture or voice, an instruction-interface call is triggered according to the instruction obtained after gesture/speech recognition and conversion; synchronous instruction distribution distributes the interactive instruction synchronously to the other augmented-reality devices; and instruction-action callback establishes an interaction channel with the operated target after an instruction is received and triggers the target's callback method, so that the target executes the feedback action.
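Pointing-target positioning — combining the gesture origin and direction with the AR scene's coordinate system to find the target being pointed at — is essentially a ray-intersection test. A minimal sketch using bounding spheres follows; the target radii and coordinates are illustrative assumptions:

```python
import math

def pick_target(origin, direction, targets):
    """Return the id of the nearest target whose bounding sphere the
    pointing ray hits. `targets` maps id -> (center, radius); points are
    (x, y, z) tuples in scene coordinates."""
    # Normalize the pointing direction.
    norm = math.sqrt(sum(c * c for c in direction))
    d = [c / norm for c in direction]
    best = None
    for tid, (center, radius) in targets.items():
        oc = [c - o for c, o in zip(center, origin)]
        t = sum(a * b for a, b in zip(oc, d))        # projection on the ray
        if t < 0:
            continue                                  # target is behind the user
        closest = [o + t * di for o, di in zip(origin, d)]
        dist2 = sum((c - p) ** 2 for c, p in zip(center, closest))
        if dist2 <= radius ** 2 and (best is None or t < best[0]):
            best = (t, tid)
    return best[1] if best else None
```

Returning the nearest hit along the ray matches the intuition that, of several targets in the pointing cone, the one closest to the commander is the one selected.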
The applicable environment, internal relationships and calling logic, and implementation process of this patent are further explained below:
(1) Requirements on augmented-reality devices
(2) Internal relationships and calling logic
The internal relationships and calling logic of this patent are shown in Fig. 3 and are introduced below in terms of user permission management, battlefield information-source access, the C2-system integration framework, external content loading, virtual battlefield space modeling, virtual battlefield space display, collaborative-interaction common support, human-portrait collaboration, human-target collaboration, and human-target interaction.
A) User permission management
A commander must log in and be authenticated by means such as fingerprint or password before entering the virtual battlefield environment; only verified users can enter the virtual battlefield space and use the related functions.
B) Battlefield information-source access
An external interface obtains land/sea/air/space information sources in an active or passive manner; after processing and arrangement, the information enters the virtual battlefield environment and is bound to targets, or displayed directly and collectively in the information-source window.
C) C2-system integration framework
Service-integration functions such as service registration, addressing, and proxy access are provided, capable of integrating the relevant functions of in-service C2 systems; a system display window is provided to integrate and display C2-system function pages, and, combined with the human-target interaction capability, operations such as clicking, double-clicking, dragging, and selecting on the system function interface are realized.
D) External content loading
A local storage space is provided to receive external files such as video, text, pictures, PPT, Word, and Excel; the files in the local storage space are polled periodically and loaded and refreshed in the property window of the virtual battlefield environment, and, combined with the human-target interaction capability, operations such as page turning and clicking on the content are realized.
E) Virtual battlefield space modeling
Models of all kinds of targets, terrain, vegetation, trees, etc. are built through the modeling process; after the models are loaded and run on the augmented-reality devices, the global virtual battlefield environment is generated.
F) Virtual battlefield space display
Roaming through and eagle-eye viewing of the global virtual battlefield environment are realized; partial models can also be magnified, reduced, or cut out, and the feedback of all these operations is presented to the commander through the augmented-reality device.
G) Collaborative-interaction common support
An instruction transmission channel is constructed between the gesture auxiliary device and all kinds of dynamic targets in the virtual battlefield environment, opening up the interaction channel between human and target; a unified clock is provided to ensure that multi-party collaborative content stays synchronized; conversion between real-world coordinates and virtual-battlefield-environment coordinates is provided, supplying location services for multi-party collaboration and human-target interaction; and a state collaboration service is provided to ensure that target states can be sent simultaneously to the augmented-reality devices of all commanders in the virtual battlefield space.
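The real-world-to-virtual-battlefield coordinate conversion that underpins this location service can be sketched as a rigid transform (scale, rotation about the vertical axis, translation); the parameter values and axis conventions are assumptions:

```python
import math

def real_to_virtual(p, scale=1.0, yaw_rad=0.0, offset=(0.0, 0.0, 0.0)):
    """Map a real-world point (x, y, z) into virtual-battlefield coordinates
    by scaling, rotating about the vertical (y) axis, then translating."""
    x, y, z = (c * scale for c in p)
    c, s = math.cos(yaw_rad), math.sin(yaw_rad)
    xr, zr = c * x + s * z, -s * x + c * z
    return (xr + offset[0], y + offset[1], zr + offset[2])
```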
H) Human-portrait collaboration
The capture devices acquire the holographic portrait, and the holographic three-dimensional portrait is generated after rendering; the captured portrait and sound are further processed, and the processed portrait is sent via a streaming-media server to the augmented-reality devices. After processing by the collaborative-interaction common support function, it is sent over the network and communication-service links to the augmented-reality devices of the other commanders, thereby realizing holographic-portrait interactive collaboration among multiple augmented-reality devices.
I) Human-target interaction
User voice is captured by the audio capture device and user gestures by the gesture auxiliary device; after analysis, the selected target is determined and a control instruction for the target is formed. The instruction is sent to the specified target through the collaborative-interaction common support function, and the target's feedback action is triggered after it receives the instruction.
J) Human-target collaboration
Through the collaborative-interaction common support function, the changes in features such as position, shape, and color after a target receives an instruction and gives feedback are distributed synchronously to the augmented-reality devices of all commanders.
(3) Implementation process
The implementation process of this patent is shown in Fig. 4. The entire collaborative-interaction process based on the virtual battlefield environment is as follows:
A) Each class of model in the virtual battlefield environment is constructed; after construction, the models are loaded and run on the augmented-reality devices, and the virtual battlefield environment is generated;
B) Before wearing the augmented-reality device, a commander must log in and be authenticated; only after passing verification can he enter the virtual battlefield environment;
C) After verification, the body image data obtained by the portrait-capture device also enter the virtual battlefield environment; after a series of processing steps such as feature processing, depth processing, color extraction, and portrait modeling, a holographic three-dimensional portrait is formed and fused with the virtual battlefield environment;
E) Through the human-portrait collaboration function, changes in each commander's expression, movement, position, and limbs are synchronized to the augmented-reality devices of the other commanders, and the holographic three-dimensional portrait display is refreshed, so that everyone can perceive the image changes of the other personnel;
F) Commanders can use roaming and the eagle-eye view in the virtual battlefield environment to inspect the battlefield as a whole, and can use functions such as cutting into a specific mission area and magnifying and rotating a target to inspect a specific region or target at close range;
G) A commander puts on the gesture auxiliary device and can point at a target in the virtual battlefield environment and form a specific gesture; the gesture perception signal is transmitted through the auxiliary device to the human-target interaction function, the gesture is analyzed to obtain the mapped operation instruction, and the instruction is sent to the pointed target. Alternatively, the commander speaks a voice instruction after selecting the target; the voice signal captured by the audio capture device is transmitted to the human-target interaction function, the voice is matched to obtain the mapped operation instruction, and the instruction is sent to the pointed target;
H) After the target receives the operation instruction, the instruction feedback code inside the target is triggered, causing changes in the target's appearance, position, state, etc.;
I) The human-target interaction function collects the target's change information and distributes it synchronously to each commander's augmented-reality device in the virtual battlefield environment, refreshing the target in the virtual battlefield space so that everyone can see the change of the target;
J) After all kinds of information sources are connected to the virtual battlefield environment, they are bound to specific targets after processing, or displayed collectively in the information-source window, and refreshed through human-target interaction;
K) All kinds of C2-system functions are integrated into function windows, and system functions are operated through human-target interaction;
L) All kinds of external content resources are bound to specific targets after processing, or displayed collectively in the information-source window, and refreshed through human-target interaction.

Claims (10)

1. A multi-terminal fusion collaborative command system for the military field based on augmented reality, characterized by comprising ten parts: user permission management, battlefield information-source access, a C2-system integration framework, external content loading, virtual battlefield space modeling, virtual battlefield space display, collaborative-interaction common support, human-portrait collaboration, human-target collaboration, and human-target interaction; wherein the user permission management includes room management, personnel management, permission management, and collaboration invitation functions; the personnel management maintains the information of the users who will join the virtual scene and participate in the collaboration; through the room management an initiator can create the collaboration space within a virtual scene; through the collaboration invitation the initiator can select personnel to join the collaboration space; through the permission management each person's collaboration and interaction permissions in the collaboration space can be set; a commander must log in and be authenticated by means such as fingerprint or password before entering the virtual battlefield environment, and only verified users can enter the virtual battlefield space and use the related functions; the battlefield information-source access provides support for connecting, in an active or passive manner, to land, sea, air, and space information sources in fixed and tactical environments; information obtained in real time can be processed and analyzed and converted into battlefield situation information supporting collaborative command, bound to targets, or displayed collectively in the information-source window; the C2-system integration framework provides integration of C2 function services, supporting the integration of Web-class C2 applications and RPC, gRPC, Webservice, and Restful-class C2 function services, and, combined with the human-target interaction function, enables commanders in the virtual battlefield space to operate the C2 system with system feedback results refreshed in real time; the external content loading refers to opening a content window at a designated position in the virtual battlefield space, periodically polling the files in the local storage space and loading and refreshing them in the property window of the virtual battlefield environment, and, combined with the human-target interaction capability, realizing operations such as page turning and clicking on the content, providing loading and display of content in multiple formats, and supporting loading and display of video streams; the virtual battlefield space modeling refers to combining geographic information system and modeling technology, building models of all kinds of targets, terrain, vegetation, and trees through the modeling process, generating a global virtual battlefield environment after loading and running on the augmented-reality devices, realizing a lifelike restoration of the battlefield environment, and allowing the battlefield environment and the movement states of friendly and enemy targets to be viewed from different angles; the virtual battlefield space display provides multiple browsing modes, the feedback of all these browsing modes is presented to the commander through the augmented-reality device, and the commander can conveniently view the battlefield space from different angles; the collaborative-interaction common support provides clock synchronization, coordinate calculation, instruction collaboration, and state collaboration functions; the clock synchronization provides timing support for collaboration between users and between users and content; the coordinate calculation provides basic algorithm support for positioning holographic three-dimensional portraits in the virtual space; the instruction collaboration supports the distribution, to each glasses device, of the operation instructions converted from user gestures; the state collaboration supports the distribution, to each glasses device, of the target states that change in the virtual scene; the human-portrait collaboration provides character-feature extraction, multichannel portrait fusion, voice denoising, and audio-video synchronization functions; the character-feature extraction captures the portrait from multiple angles with a group of depth cameras; the multichannel portrait fusion extracts coherent video parameters from the portraits captured at different angles and splices and combines them into complete holographic three-dimensional portrait parameters; the voice denoising performs noise reduction on the voice; the audio-video synchronization synchronizes the portrait and voice captured at the same moment and distributes them to all glasses devices under a unified time reference, ensuring that everyone sees the others while also hearing their voices; the human-target collaboration provides target-state feedback, target-state acquisition, target-state distribution, and target-state synchronization functions; the target-state feedback means the target model reacts differently according to the interactive instruction, driving corresponding changes in the target's position, shape, color, state, and content; the target-state acquisition obtains the change of target state and extracts the specific changed parameter values; the target-state distribution distributes the change of target state, taking the initiator as the reference, to everyone's augmented-reality devices; the target-state synchronization compares and verifies the target-state change distributed to a commander's augmented-reality device against the local target state and, if an inconsistency is found, has it handled by the collaboration client deployed in the augmented-reality device, ensuring that the displayed target state is consistent across all augmented-reality devices; the human-target interaction provides gesture-feature extraction, gesture-motion analysis, interactive-instruction conversion, pointing-target positioning, gesture-instruction acquisition, voice-instruction acquisition, synchronous instruction distribution, and instruction-action callback functions; the gesture-feature extraction perceives, from the auxiliary device, the direction and relative-position data of the commander's current gesture; the gesture-motion analysis extracts the gesture sequence within a period of time from the auxiliary device; the interactive-instruction conversion analyzes the gesture sequence, compares it with predefined gesture motions, and converts it into an operation instruction; the pointing-target positioning determines the coordinates and position of the pointed target from the gesture direction and relative-position data combined with the coordinate system of the augmented-reality scene; the gesture-instruction acquisition and voice-instruction acquisition mean that when the user issues an instruction by gesture or voice, an instruction-interface call is triggered according to the instruction obtained after gesture or speech recognition and conversion; the synchronous instruction distribution distributes the interactive instruction synchronously to the other augmented-reality devices; and the instruction-action callback establishes an interaction channel with the operated target after an instruction is received and triggers the target's callback method, so that the target executes the feedback action.
2. multiterminal fusion collaboration command system of the military field according to claim 1 based on augmented reality, feature exist In can the collaboration permission include the restriction in user access virtual battlefield, see other users image to user and assist therewith Restriction together, the restriction whether user can be seen by specified target and interacted.
3. multiterminal fusion collaboration command system of the military field according to claim 1 based on augmented reality, feature exist In the battlefield information source access includes drawing to connect all kinds of battlefield information sources of my army's active service, and can be shown in many ways.
4. multiterminal fusion collaboration command system of the military field according to claim 1 based on augmented reality, feature exist In the Combat Command System integrated framework is provided to Web class Combat Command System application/RPC, gRPC, Webservice, Restful class It accuses the integrated of function services, and commanding is supported to grasp in virtual battlefield environment to integrated Combat Command System function Make.
5. multiterminal fusion collaboration command system of the military field according to claim 1 based on augmented reality, feature exist In, exterior content load provide the multiple format content that load is shown include external video, picture, text, PPT, Word, Excel, and can show that content carries out the page turning of basic content, point selection operation to load.
6. multiterminal fusion collaboration command system of the military field according to claim 1 based on augmented reality, feature exist In the global virtual battlefield environment of the Virtual Battlefield spatial modeling creation realizes the reduction true to nature to battlefield surroundings.
7. multiterminal fusion collaboration command system of the military field according to claim 1 based on augmented reality, feature exist In a variety of browsing modes that the Virtual Battlefield space representation provides include hawkeye, roaming, the incision of particular task region and cut Out, target amplification and reduction.
8. The multi-terminal fusion collaborative command system for the military field based on augmented reality according to claim 1, characterized in that the human-human-target interaction analyzes and perceives a user's target-pointing gestures, achieving precise positioning of targets in the virtual battlefield environment.
9. The multi-terminal fusion collaborative command system for the military field based on augmented reality according to claim 1, characterized in that, on the basis of precise target positioning, the human-human-target interaction converts user gestures into manipulation instructions, sends them to the user-selected target, and triggers the target's feedback action.
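A common way to realize pointing-gesture target selection is to cast a ray from the hand and pick the target closest to the ray direction within an angular tolerance, then wrap the recognized gesture as a manipulation instruction. This is a sketch under that assumption; the function, target names, and 5-degree threshold are all illustrative:

```python
import math


def select_target(origin, direction, targets, max_angle_deg=5.0):
    """Pick the target whose bearing is closest to the pointing ray, within a tolerance."""
    def angle_to(pos):
        v = [p - o for p, o in zip(pos, origin)]
        dot = sum(a * b for a, b in zip(v, direction))
        norm = math.sqrt(sum(a * a for a in v)) * math.sqrt(sum(a * a for a in direction))
        # Clamp to guard against floating-point drift outside acos's domain.
        return math.degrees(math.acos(max(-1.0, min(1.0, dot / norm))))

    best_id, best_pos = min(targets.items(), key=lambda kv: angle_to(kv[1]))
    return best_id if angle_to(best_pos) <= max_angle_deg else None


targets = {"tank_01": (10.0, 0.0, 0.0), "uav_02": (0.0, 10.0, 0.0)}
picked = select_target((0, 0, 0), (1, 0, 0), targets)
# The recognized gesture type becomes the manipulation instruction sent to the target.
instruction = {"target": picked, "action": "highlight"}
print(instruction)  # {'target': 'tank_01', 'action': 'highlight'}
```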
10. The multi-terminal fusion collaborative command system for the military field based on augmented reality according to claim 1, characterized in that the human-human-target interaction ensures that the target feedback action triggered by a manipulation instruction can be synchronously perceived by all commanders in the virtual battlefield environment.
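Synchronous perception by all commanders amounts to broadcasting each feedback event to every session joined to the shared battlefield, an observer-style design. The sketch below assumes that design; the `VirtualBattlefield` API and commander IDs are illustrative:

```python
# Illustrative broadcast of target feedback: every commander joined to the shared
# virtual battlefield perceives the same feedback event, in the same order.
class VirtualBattlefield:
    def __init__(self):
        self._commanders = {}  # commander id -> perceived event log

    def join(self, commander_id: str):
        self._commanders[commander_id] = []

    def trigger_feedback(self, target_id: str, action: str):
        event = {"target": target_id, "action": action}
        for log in self._commanders.values():  # broadcast to all joined commanders
            log.append(event)
        return event

    def perceived_by(self, commander_id: str):
        return self._commanders[commander_id]


bf = VirtualBattlefield()
bf.join("cmdr_a")
bf.join("cmdr_b")
bf.trigger_feedback("tank_01", "highlight")
print(bf.perceived_by("cmdr_a") == bf.perceived_by("cmdr_b"))  # True
```

In a deployed system the in-process loop would be replaced by network delivery to each terminal, but the invariant is the same: one manipulation instruction, one identical feedback event at every commander.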
CN201910586319.XA 2019-07-01 2019-07-01 Multi-terminal fusion cooperative command system based on augmented reality in military field Active CN110365666B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910586319.XA CN110365666B (en) 2019-07-01 2019-07-01 Multi-terminal fusion cooperative command system based on augmented reality in military field


Publications (2)

Publication Number Publication Date
CN110365666A true CN110365666A (en) 2019-10-22
CN110365666B CN110365666B (en) 2021-09-14

Family

ID=68217624

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910586319.XA Active CN110365666B (en) 2019-07-01 2019-07-01 Multi-terminal fusion cooperative command system based on augmented reality in military field

Country Status (1)

Country Link
CN (1) CN110365666B (en)


Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6215498B1 (en) * 1998-09-10 2001-04-10 Lionhearth Technologies, Inc. Virtual command post
CN101964019A (en) * 2010-09-10 2011-02-02 北京航空航天大学 Against behavior modeling simulation platform and method based on Agent technology
WO2013111146A2 (en) * 2011-12-14 2013-08-01 Virtual Logic Systems Private Ltd System and method of providing virtual human on human combat training operations
CN107545788A (en) * 2017-10-17 2018-01-05 北京华如科技股份有限公司 Goods electronic sand map system is deduced based on the operation that augmented reality is shown
CN108664121A (en) * 2018-03-31 2018-10-16 中国人民解放军海军航空大学 A kind of emulation combat system-of-systems drilling system


Non-Patent Citations (3)

* Cited by examiner, † Cited by third party
Title
PETR FRANTIS: "Technical issues of Virtual Sand Table system", 《2017 INTERNATIONAL CONFERENCE ON MILITARY TECHNOLOGIES (ICMT)》 *
MENG DEDI: "Research on a battlefield environment target analysis system based on artificial intelligence", Proceedings of the 6th China Command and Control Conference (Volume I) *
DU SILIANG: "Research on combat resource virtualization application technology in information systems", Journal of Command and Control *

Cited By (17)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110989842A (en) * 2019-12-06 2020-04-10 国网浙江省电力有限公司培训中心 Training method and system based on virtual reality and electronic equipment
CN111274910A (en) * 2020-01-16 2020-06-12 腾讯科技(深圳)有限公司 Scene interaction method and device and electronic equipment
CN111274910B (en) * 2020-01-16 2024-01-30 腾讯科技(深圳)有限公司 Scene interaction method and device and electronic equipment
US20220156986A1 (en) * 2020-01-16 2022-05-19 Tencent Technology (Shenzhen) Company Limited Scene interaction method and apparatus, electronic device, and computer storage medium
CN111467789A (en) * 2020-04-30 2020-07-31 厦门潭宏信息科技有限公司 Mixed reality interaction system based on Holo L ens
CN111651057A (en) * 2020-06-11 2020-09-11 浙江商汤科技开发有限公司 Data display method and device, electronic equipment and storage medium
CN112232172B (en) * 2020-10-12 2021-12-21 上海大学 Multi-person cooperation simulation system for electronic warfare equipment
CN112232172A (en) * 2020-10-12 2021-01-15 上海大学 Multi-person cooperation simulation system for electronic warfare equipment
CN113037616A (en) * 2021-03-31 2021-06-25 中国工商银行股份有限公司 Interactive method and device for cooperatively controlling multiple robots
CN113254641B (en) * 2021-05-27 2021-11-16 中国电子科技集团公司第十五研究所 Information data fusion method and device
CN113254641A (en) * 2021-05-27 2021-08-13 中国电子科技集团公司第十五研究所 Information data fusion method and device
CN114579023A (en) * 2021-12-13 2022-06-03 北京市建筑设计研究院有限公司 Modeling method and device and electronic equipment
CN115439635A (en) * 2022-06-30 2022-12-06 亮风台(上海)信息科技有限公司 Method and equipment for presenting mark information of target object
CN115439635B (en) * 2022-06-30 2024-04-26 亮风台(上海)信息科技有限公司 Method and equipment for presenting marking information of target object
CN115808974A (en) * 2022-07-29 2023-03-17 深圳职业技术学院 Immersive command center construction method and system and storage medium
CN115808974B (en) * 2022-07-29 2023-08-29 深圳职业技术学院 Immersive command center construction method, immersive command center construction system and storage medium
CN115826763A (en) * 2023-01-09 2023-03-21 南京宇天智云仿真技术有限公司 Special combat simulation system and method based on virtual reality

Also Published As

Publication number Publication date
CN110365666B (en) 2021-09-14

Similar Documents

Publication Publication Date Title
CN110365666A (en) Multiterminal fusion collaboration command system of the military field based on augmented reality
US11915670B2 (en) Systems, methods, and media for displaying interactive augmented reality presentations
US11403595B2 (en) Devices and methods for creating a collaborative virtual session
Pavlik Journalism in the age of virtual reality: How experiential media are transforming news
Leigh et al. A review of tele-immersive applications in the CAVE research network
EP3595789B1 (en) Virtual reality system using an actor and director model
CN102625129A (en) Method for realizing remote reality three-dimensional virtual imitated scene interaction
CN107479705A (en) A kind of command post's work compound goods electronic sand map system based on HoloLens
US20210287416A1 (en) Methods and Systems for Creating an Immersive Character Interaction Experience
CN107463248A (en) A kind of remote interaction method caught based on dynamic with line holographic projections
CN104603807A (en) Mobile video conferencing with digital annotation
CN107463262A (en) A kind of multi-person synergy exchange method based on HoloLens
Tennent et al. Thresholds: Embedding virtual reality in the museum
Boddington The Internet of Bodies—alive, connected and collective: the virtual physical future of our bodies and our senses
CN102170361A (en) Virtual-reality-based network conference method
KR20200097637A (en) Simulation sandbox system
CN105938541A (en) System and method for enhancing live performances with digital content
CN111862711A (en) Entertainment and leisure learning device based on 5G internet of things virtual reality
WO2024060799A1 (en) Multi-device collaboration method, and client
DE102016116582A1 (en) Method and apparatus for displaying augmented reality
CN202551219U (en) Long-distance three-dimensional virtual simulation synthetic system
Lang The impact of video systems on architecture
Song Interpersonal Communication Research in Metaverse
Sun et al. Video Conference System in Mixed Reality Using a Hololens
Glahn et al. Satellite Arts: a television of attractions

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant