CN114612640A - Space-based situation simulation system based on mixed reality technology - Google Patents


Info

Publication number
CN114612640A
Authority
CN
China
Prior art keywords
space
real
unit
holographic
user
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202210297002.6A
Other languages
Chinese (zh)
Inventor
王宇翔
廖通逵
王帅
李晓明
徐仁杰
李示威
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Aerospace Hongtu Information Technology Co Ltd
Original Assignee
Aerospace Hongtu Information Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Aerospace Hongtu Information Technology Co Ltd filed Critical Aerospace Hongtu Information Technology Co Ltd
Priority to CN202210297002.6A priority Critical patent/CN114612640A/en
Publication of CN114612640A publication Critical patent/CN114612640A/en
Pending legal-status Critical Current

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T19/00Manipulating 3D models or images for computer graphics
    • G06T19/006Mixed reality
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F3/013Eye tracking input arrangements
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/017Gesture based interaction, e.g. based on a set of recognized hand gestures
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F30/00Computer-aided design [CAD]
    • G06F30/20Design optimisation, verification or simulation
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/70Determining position or orientation of objects or cameras
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2219/00Indexing scheme for manipulating 3D models or images for computer graphics
    • G06T2219/20Indexing scheme for editing of 3D models
    • G06T2219/2016Rotation, translation, scaling

Abstract

The application provides a space-based situation simulation system based on mixed reality technology, relating to the technical field of computer simulation. The system includes: a virtual scene generation module for acquiring real-time measurement data of each space target and drawing all space targets in a virtual scene according to the real-time measurement data; a space fusion module for mapping all space targets in the virtual scene to actual physical space positions to generate a holographic image; an interaction module for acquiring the various gestures and eye actions of the user captured by the MR device and realizing interaction between user actions and the holographic space; and a far-large-near-small display module for calculating the actual distance between the user and a space target and, using a pre-obtained visual scaling coefficient, displaying targets in the holographic space in a far-large-near-small manner. The application can provide a vivid spatial situation analysis and training environment for commanders at all levels, and improve task command efficiency and the timeliness of instructions.

Description

Space-based situation simulation system based on mixed reality technology
Technical Field
The application relates to the technical field of computer simulation, in particular to a space-based situation simulation system based on a mixed reality technology.
Background
Mixed Reality (MR) technology breaks through the limitations of ordinary vision: it constructs a four-dimensional space in the human brain and mixes a three-dimensional virtual scene into real space for display, so that a user can intuitively see a real, data-driven three-dimensional interactive scene. At present, however, there is no space-based situation system implemented with mixed reality technology.
Further, in holographic space, human vision perceives objects as large when near and small when far. The perceived size of an object depends on the visual angle it subtends at the eye: the same object close to the eye subtends a large angle and appears large; far from the eye, it subtends a small angle and appears small. For space-based situation information, the problem is therefore how to ensure that the user can see the information clearly and with visual comfort, without compromising the fidelity of the space-based situation, the spatial realism, or the real-time data drive, so that a space target is neither overwhelmingly large at close range nor too small to see at long range. In a two-dimensional plane this can be achieved through a perspective effect, but in holographic space a method for achieving it is currently lacking.
Disclosure of Invention
In view of this, the present application provides a space-based situation simulation system based on a mixed reality technology, so as to solve the above technical problems in the prior art.
An embodiment of the application provides a space-based situation simulation system based on mixed reality technology, comprising:
the virtual scene generation module is used for acquiring real-time measurement data of each space target and drawing all the space targets in the virtual scene according to the real-time measurement data;
the space fusion module is used for mapping all space targets in the virtual scene to actual physical space positions to generate a holographic image;
the interaction module is used for acquiring various gestures and eye actions of the user captured by the MR equipment and realizing the interaction between the user actions and the holographic space;
and the far-large-near-small display module is used for displaying targets in the holographic space in a far-large-near-small manner by calculating the actual distance between the user and the space target and utilizing a pre-obtained visual scaling coefficient.
Further, the virtual scene generation module includes:
the real-time situation data communication analysis unit is used for acquiring real-time measurement data of each space target, and processing and analyzing the real-time measurement data;
the real-time situation data display unit is used for performing visual drawing and dynamic display on the space target by utilizing the real-time measurement data;
and the real-time situation data calculating and analyzing unit is used for performing spatial calculation on changes in the real-time measurement data and updating the displayed space situation.
Further, the spatial fusion module includes:
the spatial position mapping unit is used for mapping a spatial target of the virtual scene to a fixed position in a physical space to generate a holographic image;
and the spatial perception unit is used for calculating the position of the holographic image in real time according to the real movement of the user.
Further, the interaction module comprises:
the eye tracking unit is used for tracking the user's gaze point by collecting the position and movement of the eyes while the user keeps a head-up posture, so that the user's eyes can interact with the holographic image;
the hand tracking unit is used for acquiring hand position, rotation, size and motion, and for tracking, touching, grasping and moving the holographic image by hand, so that the holographic image reacts in the same way as a real object;
and the voice and dictation unit is used for operating the holographic image according to a voice command when one is received, and for drawing the holographic image using the voice content.
Further, the far-large-near-small display module comprises:
the determining unit is used for determining an optimal visual scaling coefficient ScaleFactor;
the calculating unit is used for calculating the actual Distance between the user and the space target according to the real-time measurement data, and calculating the display Size value Size of the space target:
Size=Distance*ScaleFactor
and the drawing unit is used for drawing the space target in the holographic space according to the display Size value Size of the space target.
Further, the determining unit is specifically configured to:
selecting a plurality of visual scaling coefficients, and performing a plurality of tests based on human biomechanics, the human field of view, the observation distance and the image character size to obtain a plurality of visual experiences;
and taking the visual scaling coefficient corresponding to the optimal visual experience as the optimal visual scaling coefficient ScaleFactor.
Further, the system further comprises: and the multi-person collaborative interaction module is used for realizing collaborative interaction of multiple persons in the holographic space of the space-based situation.
Further, the multi-person collaborative interaction module includes:
the holographic image equipment synchronization unit is used for managing the accessed holographic image equipment, performing parameter configuration and realizing the drawing of holographic images in the same space;
the holographic image collaborative interaction unit is used for simultaneously operating the holographic images by using different MR equipment and supporting multiple operation modes of the different MR equipment;
the process recording and playback unit is used for collecting, classifying and storing the various scenario deduction data and comprehensive process situations generated during scenario deduction, and for replaying the simulated scenario deduction process;
the report generation and management unit is used for generating and managing reconnaissance satellite transit information, reconnaissance satellite threat assessment information, and safe-period information for a given area or action route, according to the orbit forecast calculation results combined with the threat level of each satellite;
and the simulation engine service unit is used for establishing data-driven simulation, testing and debugging, and process control for cooperative tasks by relying on a simulation engine, and for monitoring and managing the whole simulation process.
Further, the multi-person collaborative interaction module further includes:
the collaborative task creating unit is used for sorting out all the processes and key factors of task execution according to the task analysis content, and for planning and formulating the collaborative task according to the task execution logic and the software, hardware and data information;
the task scene editing unit is used for providing editing, modification, storage and deletion of the existing information content of an established collaborative task scheme, according to the user's authority;
and the interactive state and authority management unit is used for displaying the object interactive state of the holographic space and managing and classifying the user access authority.
The application can provide a vivid spatial situation analysis and training environment for commanders at all levels, and improve task command efficiency and the timeliness of instructions.
Drawings
In order to more clearly illustrate the detailed description of the present application or the technical solutions in the prior art, the drawings needed to be used in the detailed description of the present application or the prior art description will be briefly introduced below, and it is obvious that the drawings in the following description are some embodiments of the present application, and other drawings can be obtained by those skilled in the art without creative efforts.
Fig. 1 is a functional structure diagram of a space-based situation simulation system based on a mixed reality technology according to an embodiment of the present application;
fig. 2 is a functional structure diagram of a virtual scene generation module according to an embodiment of the present application;
fig. 3 is a functional structure diagram of a space fusion module provided in the embodiment of the present application;
fig. 4 is a functional structure diagram of an interaction module provided in the embodiment of the present application;
fig. 5 is a functional structure diagram of the far-large-near-small display module according to an embodiment of the present application;
fig. 6 is a functional structure diagram of a multi-user collaborative interaction module according to an embodiment of the present application.
Detailed Description
In order to make the objects, technical solutions and advantages of the embodiments of the present application clearer, the technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application, and it is obvious that the described embodiments are some embodiments of the present application, but not all embodiments. The components of the embodiments of the present application, generally described and illustrated in the figures herein, can be arranged and designed in a wide variety of different configurations.
Thus, the following detailed description of the embodiments of the present application, presented in the accompanying drawings, is not intended to limit the scope of the claimed application, but is merely representative of selected embodiments of the application. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present application.
First, technical terms related to the embodiments of the present application will be briefly described.
1. MR technology
Mixed Reality (MR) technology is an emerging direction in visualization technology. MR is in effect an upgrade of AR (Augmented Reality): it superimposes computer-generated virtual information (objects, pictures, videos, sounds, system prompts, etc.) onto the real scene and allows it to interact with the person, synthesizing the virtual world and the real world into a seamlessly joined virtual-real fusion world in which physical entities and digital objects satisfy a true three-dimensional projection relationship. A person can not only move freely in the three-dimensional space and observe from any angle, but also interact in real time with the "entities" in the three-dimensional virtual space, realizing true three-dimensional display and an interleaving of the real and the virtual.
MR technology uses space scanning, three-dimensional construction and MR space-anchoring technology to assist perception. It breaks through the limitations of vision, constructs a four-dimensional space in the brain, and displays a three-dimensional virtual scene mixed into real space. Through three-dimensional simulated display of various kinds of spatial entity information, it presents a real three-dimensional interactive scene, drives virtual entities in the real world with data, and builds an immersive three-dimensional simulation system combining the virtual and the real, providing a realistic battlefield space situation analysis and training environment for commanders at all levels.
The space-based situation is a comprehensive description of the states and trends of elements such as the orbits, behavior events and space environment of the various spacecraft, space debris and other objects operating in space. Three-dimensional simulation can faithfully reproduce the environment and the situation, giving people an immersive, on-the-scene experience, and can provide intuitive animation, graphics and other display effects for space-based situation display and command decision-making. At present, however, mainstream three-dimensional visualization is actually shown on two-dimensional display devices such as televisions and projectors; this is not a true display of three-dimensional spatial information, and its rendering of fine, stereoscopic satellite models, TK events and the like is insufficient, making it difficult for a commander to accurately understand the course of a spatial event.
The MR device generally adopts one of two display techniques. Optical see-through superimposes the virtual image directly before the person's eye. Video see-through captures the real world seen by the user in real time through binocular cameras, digitizes it, and renders the picture in real time with computer algorithms; virtual images can then be partially or completely superimposed, and the display can even break away from the real picture, deleting and altering image content so that the human eye sees a new real-looking picture rendered by the computer. A depth camera is used to scan the real space: a three-color laser emits three beams, which are converged by a relay lens and directed onto two micro-mirrors, a fast mirror controlling horizontal scanning and a slow mirror controlling vertical movement. Spatial depth data is thereby obtained, the space is redrawn, and the application scene in the system is placed at the corresponding position in the real-space model.
MR blends the real world and the virtual world to create a new visualization environment. The key word is "redraw": the real world is redrawn and virtual information is superimposed on it; the virtual information can interact with the redrawn real world, parts of reality can be retained, and the display can switch freely between virtual and real, enriching the environment.
Unlike the PC, which is operated with a mouse and keyboard, MR technology interacts by capturing hand movements. The HoloLens 2 is equipped with four depth cameras; hand motion is recognized and captured through depth, different gestures are predefined from the hand's movements, and different gestures start different functions.
2. Display technology of space-based situation in holographic space
The local data list is read through the IO system, parsed into structures in a user-defined format, classified and stored in memory, and then drawn in the virtual space in the form of nodes.
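By way of illustration only, the following minimal Python sketch shows one way such a read-parse-classify pipeline could look, assuming a CSV-like local data list; the SpaceTargetNode structure and its field names are hypothetical, since the user-defined format is not specified here:

```python
import csv
from collections import defaultdict
from dataclasses import dataclass

# Illustrative record structure; the actual user-defined format is not specified.
@dataclass
class SpaceTargetNode:
    target_id: str
    kind: str        # e.g. "spacecraft" or "debris"
    x: float
    y: float
    z: float

def load_local_data_list(path):
    """Read the local data list, parse rows into nodes, and classify them in memory."""
    nodes_by_kind = defaultdict(list)
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            node = SpaceTargetNode(
                target_id=row["id"],
                kind=row["kind"],
                x=float(row["x"]),
                y=float(row["y"]),
                z=float(row["z"]),
            )
            nodes_by_kind[node.kind].append(node)
    return nodes_by_kind  # each node would then be drawn in the virtual space
```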
3. Real-time display and control technology for space-based situation
Trajectory data is position information, i.e. the individual spatial coordinate points along a space target's path. Drawing a connecting line between every pair of adjacent nodes in space visualizes the trajectory. To make the trajectory smoother and closer to the actual motion path, the coordinate points must be dense enough; however, the memory of an MR device has a threshold, and overly dense coordinate points can exhaust the headset's memory, so the relationship between memory and trajectory smoothness must be balanced. Repeated testing showed that, when a coordinate point is generated every two seconds, connecting every 50 spatial coordinate points is the optimal balance point for space target trajectory simulation.
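A hedged sketch of this balancing step follows; only the every-50-points and two-second figures come from the text above, while the decimate_trajectory helper and the example pass are illustrative assumptions:

```python
def decimate_trajectory(points, step=50):
    """Keep every `step`-th coordinate point (here 50, with one point generated
    every two seconds), always retaining the final point so the drawn polyline
    ends at the target's latest position."""
    kept = points[::step]
    if points and kept[-1] != points[-1]:
        kept.append(points[-1])
    return kept  # consecutive kept points are then joined by line segments

# Example: a 1-hour pass sampled every 2 s -> 1800 raw points, 37 drawn nodes.
raw = [(float(i), 0.0, 0.0) for i in range(1800)]
polyline = decimate_trajectory(raw)
print(len(polyline))
```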
4. Spatial processing and tracking techniques
One MR device scans the spatial anchor point, and the anchor data is transmitted synchronously to the other devices over the network and saved once. When the spatial anchor drifts or becomes erroneous, it can be recovered from the saved data, and the anchors of all devices corrected and resynchronized, so that the coordinate origins of all devices lie at the same position in reality; that is, all devices share the same world coordinate system, and synchronizing the virtual objects in the space then only requires synchronizing coordinates.
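The following is a minimal, non-authoritative sketch of this anchor-sharing idea, with the network transport reduced to in-process calls; the Device class and broadcast_anchor helper are hypothetical, and no real MR SDK anchor API is shown:

```python
import json

class Device:
    def __init__(self, name):
        self.name = name
        self.anchor = None  # world-origin pose: position + rotation

    def receive_anchor(self, payload):
        # Overwrite the local anchor from the serialized copy, so this
        # device's coordinate origin matches the scanning device's.
        self.anchor = json.loads(payload)

def broadcast_anchor(scanner_anchor, devices):
    payload = json.dumps(scanner_anchor)  # also persisted once for recovery
    for d in devices:
        d.receive_anchor(payload)
    return payload  # the saved copy

# Usage: device A scans the room and anchors the origin; B and C adopt it.
anchor = {"position": [0.0, 0.0, 0.0], "rotation": [0.0, 0.0, 0.0, 1.0]}
others = [Device("B"), Device("C")]
saved = broadcast_anchor(anchor, others)
# If an anchor later drifts, re-applying `saved` corrects and resynchronizes it.
```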
5. Three-dimensional interaction of space-based situation in holographic space
Through gesture recognition, the real hand position is mapped into virtual space, bringing the real hand into the virtual scene; the positional relationship between the hand and objects such as UI elements and models in the virtual space is then known, and by further detecting finger positions and states, clicking on the UI and grabbing objects can be realized.
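A minimal sketch of this mapping-and-detection idea is given below; the helper names, the 2 cm pinch threshold and the sphere hit test are illustrative assumptions, not values from the text:

```python
import math

def to_virtual(hand_pos_real, anchor_origin):
    """Map a real-world hand position into virtual-space coordinates by
    expressing it relative to the shared anchor origin (rotation omitted)."""
    return tuple(h - o for h, o in zip(hand_pos_real, anchor_origin))

def is_pinch(thumb_tip, index_tip, threshold=0.02):
    """Treat thumb and index fingertips closer than ~2 cm as a click/grab."""
    return math.dist(thumb_tip, index_tip) < threshold

def hit_test(point, ui_center, radius=0.06):
    """Simple sphere hit test: is the mapped hand point on a UI element?"""
    return math.dist(point, ui_center) < radius

hand = to_virtual((1.20, 1.05, 0.40), (1.0, 0.0, 0.0))   # -> (0.2, 1.05, 0.4)
if is_pinch((0.20, 1.05, 0.40), (0.21, 1.05, 0.41)) and hit_test(hand, (0.2, 1.05, 0.4)):
    print("UI element clicked / object grabbed")
```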
6. Distance and near size technology for holographic information
In holographic space, spatial positioning and positional relationships are indispensable factors. First, the basic distance (Distance) between the glasses and the virtual object is tested through continuous debugging, and the optimal visual scaling factor (ScaleFactor) is found by combining human biomechanics, the human field of view, the observation distance and the character size. Second, the display scale of the space-based situation information is bound to its distance through the proportional relationship Size = Distance * ScaleFactor: the farther the distance, the larger the displayed information; conversely, the closer the distance, the smaller the displayed information. The distance between the camera and the target is calculated and, matched with the scaling coefficient, makes the display size directly proportional to the distance, achieving the far-large-near-small effect.
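The proportional relationship is simple enough to sketch. In the following Python fragment only the formula Size = Distance * ScaleFactor comes from the text above; the SCALE_FACTOR value is an invented placeholder, since the factor is obtained empirically:

```python
import math

SCALE_FACTOR = 0.05  # illustrative value; the optimal factor is found by testing

def display_size(user_pos, target_pos, scale_factor=SCALE_FACTOR):
    """Size = Distance * ScaleFactor: scale a hologram in proportion to its
    distance so its apparent (angular) size stays readable at any range."""
    distance = math.dist(user_pos, target_pos)
    return distance * scale_factor

# A target 2 m away and one 40 m away get sizes 0.1 and 2.0: the far target
# is drawn 20x larger, so both subtend roughly the same visual angle.
print(display_size((0, 0, 0), (0, 0, 2)))   # 0.1
print(display_size((0, 0, 0), (0, 0, 40)))  # 2.0
```

Because size grows linearly with distance, the ratio of size to distance stays constant, which is exactly the far-large-near-small effect.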
After introducing the technical terms related to the present application, the design ideas of the embodiments of the present application will be briefly described below.
The application provides a space-based situation simulation system based on mixed reality technology. The system uses space scanning, three-dimensional construction and MR space-anchoring technology to assist perception; it breaks through the limitations of vision, constructs a four-dimensional space in the brain, and displays a three-dimensional virtual scene mixed into real space. Through three-dimensional simulated display of various kinds of spatial entity information, it presents a real three-dimensional interactive scene, drives the virtual entities with real-world data, builds an immersive three-dimensional simulation system combining the virtual and the real, and provides a vivid battlefield space situation analysis and training environment for commanders at all levels. Interaction with the space-based situation system takes place in holographic space, in both two and three dimensions: operations such as switching the situation scene, and dragging, moving, zooming and rotating the model, are all performed with gestures.
The MR space-based situation system moves the traditional large screen to directly in front of the user, presenting the combat-command three-dimensional sand table, the space-situation Earth, satellite models and related situations in three dimensions. It breaks the limitation of space, realizes multi-person cooperative work, task command and situation deduction without confinement to a fixed area, and can improve task command efficiency and the timeliness of instructions.
After introducing the application scenario and the design concept of the embodiment of the present application, the following describes a technical solution provided by the embodiment of the present application.
As shown in fig. 1, an embodiment of the present application provides a space-based situation simulation system based on a mixed reality technology, where the system 100 includes:
the virtual scene generation module 101 is configured to obtain real-time measurement data of each spatial target, and draw all the spatial targets in a virtual scene according to the real-time measurement data;
the spatial fusion module 102 is configured to map all spatial targets in the virtual scene to actual physical space positions to generate a holographic image;
the interaction module 103 is used for acquiring various gestures and eye actions of the user captured by the MR device and realizing interaction between the user actions and the holographic space;
the far-large-near-small display module 104 is configured to calculate the actual distance between the user and a space target and, using a pre-obtained visual scaling coefficient, display targets in the holographic space in the far-large-near-small manner.
In this embodiment, as shown in fig. 2, the virtual scene generating module 101 includes:
the real-time situation data communication analysis unit 201 is used for acquiring real-time measurement data of each space target, and processing and analyzing the real-time measurement data;
the real-time situation data display unit 202 is used for performing visual drawing and dynamic display on the space target by using the real-time measurement data;
and the real-time situation data calculating and analyzing unit 203 is used for performing spatial calculation on changes in the real-time measurement data and updating the displayed space situation.
In this embodiment, as shown in fig. 3, the spatial fusion module 102 includes:
a spatial position mapping unit 301, configured to map a spatial target of a virtual scene to a fixed position in a physical space, and generate a hologram;
and the spatial perception unit 302 is used for calculating the position of the holographic image in real time according to the real movement of the user.
In this embodiment, as shown in fig. 4, the interaction module 103 includes:
the eye tracking unit 401 is used for tracking the user's gaze point by collecting the position and movement of the eyes while the user keeps a head-up posture, so that the user's eyes can interact with the hologram;
a hand tracking unit 402, configured to acquire hand position, rotation, size and motion, and to track, touch, grasp and move the hologram by hand, so that the hologram reacts in the same way as a real object;
and a voice and dictation unit 403, configured to operate the hologram according to a voice command when one is received, and to draw the hologram using the voice content.
In this embodiment, as shown in fig. 5, the far-large-near-small display module 104 includes:
the determining unit 501 is configured to select a plurality of visual scaling coefficients, perform a plurality of tests based on human biomechanics, the human field of view, the observation distance and the image character size to obtain a plurality of visual experiences, and take the visual scaling coefficient corresponding to the optimal visual experience as the optimal visual scaling coefficient ScaleFactor (illustrated in the sketch following this unit list);
a calculating unit 502, configured to calculate an actual Distance between the user and the space target according to the real-time measurement data, and calculate a display Size value Size of the space target:
Size=Distance*ScaleFactor
Here, the "near-large-far-small" of focal perspective is calculated from the vertical (shortest) distance between the observed object and the viewpoint: the same object looks larger when it is closer to the eye than when it is farther away. This is an objective fact of everyday life and a basic principle of the visual arts. In some cases, however (for example, aerospace visualization), it is not suitable and gives a poor visualization effect, so an anti-focal-perspective phenomenon is needed: far-large-near-small must hold locally, so that the operator can clearly see local details without affecting the overall effect. This technique has been implemented through visual disparity in the two-dimensional plane, but had not previously been implemented in holographic space.
A drawing unit 503, configured to draw the spatial object in the holographic space according to the display Size value Size of the spatial object.
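A sketch of how the determining unit's selection might be scripted is shown below; the candidate factors and ratings are invented for illustration, the only element carried over from the text being that the factor with the best visual experience is kept:

```python
# Hypothetical calibration run for determining unit 501: try several candidate
# factors, collect a comfort/legibility rating for each test condition, and
# keep the factor with the best average rating. All numbers are made up.
candidate_factors = [0.02, 0.05, 0.08]
# ratings[f] = scores gathered over different viewing distances / character sizes
ratings = {0.02: [2, 3, 2], 0.05: [4, 5, 4], 0.08: [3, 3, 4]}

def optimal_scale_factor(candidates, ratings):
    """Return the candidate whose average rating is highest."""
    return max(candidates, key=lambda f: sum(ratings[f]) / len(ratings[f]))

print(optimal_scale_factor(candidate_factors, ratings))  # -> 0.05
```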
Further, the system further comprises: and the multi-person collaborative interaction module 105 is used for realizing collaborative interaction of multiple persons in the holographic space of the space-based situation.
In this embodiment, as shown in fig. 6, the multi-person collaborative interaction module 105 includes:
a cooperative task creating unit 601, configured to sort out all the processes and key factors of task execution according to the task analysis content, and to plan and formulate the cooperative task according to the task execution logic and the software, hardware and data information;
a task scene editing unit 602, configured to provide editing, modification, storage and deletion of the existing information content of an established collaborative task scheme, according to the user's authority;
the interaction state and authority management unit 603 is configured to display an object interaction state of the holographic space, and manage and classify user access authority;
a holographic image device synchronization unit 604, configured to manage the accessed holographic image devices, perform parameter configuration, and register the holograms to the real world, so that the holograms are drawn in the same space on the other devices;
a holographic image collaborative interaction unit 605, configured to allow different MR devices to operate the holographic space simultaneously and to support the multiple operation modes of the different devices;
a process recording and playback unit 606, configured to collect, classify and store the various scenario deduction data and comprehensive process situations generated during scenario deduction, to replay the simulated scenario deduction process, and to provide support for analysis and evaluation based on the simulation deduction data;
a report generation and management unit 607, configured to generate and manage reconnaissance satellite transit information, reconnaissance satellite threat assessment information, and safe-period information for a given area or movement route, according to the orbit forecast calculation results combined with the threat level of each satellite;
and a simulation engine service unit 608, configured to establish data-driven simulation, testing and debugging, and process control for the cooperative task by relying on a simulation engine, and to monitor and manage the whole simulation process.
While the preferred embodiments of the present application have been described, additional variations and modifications in those embodiments may occur to those skilled in the art once they learn of the basic inventive concepts. Therefore, it is intended that the appended claims be interpreted as including preferred embodiments and all alterations and modifications as fall within the scope of the application.
Finally, it should be noted that: the above embodiments are only used for illustrating the technical solutions of the present application, and not for limiting the same; although the present application has been described in detail with reference to the foregoing embodiments, it should be understood by those of ordinary skill in the art that: the technical solutions described in the foregoing embodiments may still be modified, or some or all of the technical features may be equivalently replaced; and the modifications or the substitutions do not make the essence of the corresponding technical solutions depart from the scope of the technical solutions of the embodiments of the present application.

Claims (9)

1. A mixed reality technology-based space-based situation simulation system is characterized by comprising:
the virtual scene generation module is used for acquiring real-time measurement data of each space target and drawing all the space targets in the virtual scene according to the real-time measurement data;
the space fusion module is used for mapping all space targets in the virtual scene to actual physical space positions to generate a holographic image;
the interaction module is used for acquiring various gestures and eye actions of the user captured by the MR equipment and realizing the interaction between the user actions and the holographic space;
and the far-large-near-small display module is used for displaying targets in the holographic space in a far-large-near-small manner by calculating the actual distance between the user and the space target and utilizing a pre-obtained visual scaling coefficient.
2. The mixed reality technology-based space-based situation simulation system of claim 1, wherein the virtual scene generation module comprises:
the real-time situation data communication analysis unit is used for acquiring real-time measurement data of each space target, and processing and analyzing the real-time measurement data;
the real-time situation data display unit is used for performing visual drawing and dynamic display on the space target by utilizing the real-time measurement data;
and the real-time situation data calculating and analyzing unit is used for performing spatial calculation on changes in the real-time measurement data and updating the displayed space situation.
3. The mixed reality technology-based space-based situation simulation system of claim 2, wherein the spatial fusion module comprises:
the spatial position mapping unit is used for mapping a spatial target of the virtual scene to a fixed position in a physical space to generate a holographic image;
and the spatial perception unit is used for calculating the position of the holographic image in real time according to the real movement of the user.
4. The mixed reality technology-based space-based situational simulation system of claim 3, wherein the interaction module comprises:
the eye tracking unit is used for tracking the user's gaze point by collecting the position and movement of the eyes while the user keeps a head-up posture, so that the user's eyes can interact with the holographic image;
the hand tracking unit is used for acquiring hand position, rotation, size and motion, and for tracking, touching, grasping and moving the holographic image by hand, so that the holographic image reacts in the same way as a real object;
and the voice and dictation unit is used for operating the holographic image according to a voice command when one is received, and for drawing the holographic image using the voice content.
5. The mixed reality technology-based space-based situation simulation system of claim 1, wherein the far-large-near-small display module comprises:
the determining unit is used for determining an optimal visual scaling factor ScaleFactor;
the calculating unit is used for calculating the actual Distance between the user and the space target according to the real-time measurement data, and calculating the display Size value Size of the space target:
Size=Distance*ScaleFactor
and the drawing unit is used for drawing the space target in the holographic space according to the display Size value Size of the space target.
6. The mixed reality technology-based space-based situation simulation system of claim 5, wherein the determining unit is specifically configured to:
selecting a plurality of visual scaling coefficients, and performing a plurality of tests based on human biomechanics, the human field of view, the observation distance and the image character size to obtain a plurality of visual experiences;
and taking the visual scaling coefficient corresponding to the optimal visual experience as the optimal visual scaling coefficient ScaleFactor.
7. The mixed reality technology-based space-based situational simulation system of claim 1, further comprising: and the multi-person collaborative interaction module is used for realizing collaborative interaction of multiple persons in the holographic space of the space-based situation.
8. The mixed reality technology-based space-based situation simulation system of claim 7, wherein the multi-person collaborative interaction module comprises:
the holographic image equipment synchronization unit is used for managing the accessed holographic image equipment, performing parameter configuration and realizing the drawing of holographic images in the same space;
the holographic image collaborative interaction unit is used for simultaneously operating the holographic images by using different MR equipment and supporting multiple operation modes of the different MR equipment;
the process recording and playback unit is used for collecting, classifying and storing the various scenario deduction data and comprehensive process situations generated during scenario deduction, and for replaying the simulated scenario deduction process;
the report generation and management unit is used for generating and managing reconnaissance satellite transit information, reconnaissance satellite threat assessment information, and safe-period information for a given area or action route, according to the orbit forecast calculation results combined with the threat level of each satellite;
and the simulation engine service unit is used for establishing data-driven simulation, testing and debugging, and process control for the cooperative task by relying on the simulation engine, and for monitoring and managing the whole simulation process.
9. The mixed reality technology-based space-based situation simulation system of claim 8, wherein the multi-person collaborative interaction module further comprises:
the collaborative task creating unit is used for sorting out all the processes and key factors of task execution according to the task analysis content, and for planning and formulating the collaborative task according to the task execution logic and the software, hardware and data information;
the task scene editing unit is used for providing editing, modification, storage and deletion of the existing information content of an established collaborative task scheme, according to the user's authority;
and the interactive state and authority management unit is used for displaying the object interactive state of the holographic space and managing and classifying the user access authority.
CN202210297002.6A 2022-03-24 2022-03-24 Space-based situation simulation system based on mixed reality technology Pending CN114612640A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210297002.6A CN114612640A (en) 2022-03-24 2022-03-24 Space-based situation simulation system based on mixed reality technology

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202210297002.6A CN114612640A (en) 2022-03-24 2022-03-24 Space-based situation simulation system based on mixed reality technology

Publications (1)

Publication Number Publication Date
CN114612640A (en) 2022-06-10

Family

ID=81863968

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210297002.6A Pending CN114612640A (en) 2022-03-24 2022-03-24 Space-based situation simulation system based on mixed reality technology

Country Status (1)

Country Link
CN (1) CN114612640A (en)

Cited By (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115808974A (en) * 2022-07-29 2023-03-17 深圳职业技术学院 Immersive command center construction method and system and storage medium
CN115808974B (en) * 2022-07-29 2023-08-29 深圳职业技术学院 Immersive command center construction method, immersive command center construction system and storage medium
CN115100326A (en) * 2022-08-23 2022-09-23 北京东方融创信息技术有限公司 Digital space visualization system and method
CN115100326B (en) * 2022-08-23 2022-12-06 北京东方融创信息技术有限公司 Digital space visualization system and method
CN115470710A (en) * 2022-09-26 2022-12-13 北京鼎成智造科技有限公司 Air game simulation method and device
CN115544684A (en) * 2022-10-07 2022-12-30 北京工业大学 FEA-MR-based in-situ real-time stress simulation method for clamped beams at two ends
CN115544684B (en) * 2022-10-07 2023-08-18 北京工业大学 FEA-MR-based two-end clamped beam in-situ real-time stress simulation method
CN115695762A (en) * 2022-12-30 2023-02-03 魔瞳(北京)科技有限公司 3D imaging multi-screen interaction method and system
CN115695762B (en) * 2022-12-30 2023-03-17 魔瞳(北京)科技有限公司 3D imaging multi-screen interaction method and system
CN117130492A (en) * 2023-10-27 2023-11-28 中国电子科技集团公司第十五研究所 Training situation visualization system based on mixed implementation and implementation method thereof
CN117130492B (en) * 2023-10-27 2024-01-23 中国电子科技集团公司第十五研究所 Training situation visualization system based on mixed implementation and implementation method thereof

Similar Documents

Publication Publication Date Title
CN114612640A (en) Space-based situation simulation system based on mixed reality technology
US9210413B2 (en) System worn by a moving user for fully augmenting reality by anchoring virtual objects
Billinghurst et al. Shared space: An augmented reality approach for computer supported collaborative work
US20040233192A1 (en) Focally-controlled imaging system and method
EP2919093A1 (en) Method, system, and computer for identifying object in augmented reality
US11562598B2 (en) Spatially consistent representation of hand motion
US20200311396A1 (en) Spatially consistent representation of hand motion
Veas et al. Extended overview techniques for outdoor augmented reality
WO2013171731A1 (en) A system worn by a moving user for fully augmenting reality by anchoring virtual objects
US20220375358A1 (en) Class system, viewing terminal, information processing method, and program
US11442685B2 (en) Remote interaction via bi-directional mixed-reality telepresence
CN108830944B (en) Optical perspective three-dimensional near-to-eye display system and display method
Camba et al. From reality to augmented reality: Rapid strategies for developing marker-based AR content using image capturing and authoring tools
CN113918021A (en) 3D initiative stereo can interactive immersive virtual reality all-in-one
JP2018007180A (en) Image display device, image display method and image display program
CN111651043B (en) Augmented reality system supporting customized multi-channel interaction
Cho et al. Evaluating dynamic-adjustment of stereo view parameters in a multi-scale virtual environment
JPH0831140B2 (en) High-speed image generation and display method
Nithva et al. Efficacious Opportunities and Implications of Virtual Reality Features and Techniques
CN112286355B (en) Interactive method and system for immersive content
US20240078767A1 (en) Information processing apparatus and information processing method
US20230360336A1 (en) Collaborative mixed-reality system for immersive surgical telementoring
CN117014566A (en) Remote guidance method and system based on mixed reality technology
Nováková et al. Methodical procedure for creating content for interactive augmented reality
Asiri et al. The Effectiveness of Mixed Reality Environment-Based Hand Gestures in Distributed Collaboration

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination