CN114063465A - Distributed countermeasure simulation system visual jitter elimination method and visual nodes - Google Patents

Distributed countermeasure simulation system visual jitter elimination method and visual nodes

Info

Publication number
CN114063465A
CN114063465A (application number CN202111106674.6A)
Authority
CN
China
Prior art keywords
entity
position data
next frame
visual
simulation system
Prior art date
Legal status
Granted
Application number
CN202111106674.6A
Other languages
Chinese (zh)
Other versions
CN114063465B (en)
Inventor
梅红
吉永岗
Current Assignee
AVIC First Aircraft Institute
Original Assignee
AVIC First Aircraft Institute
Priority date
Filing date
Publication date
Application filed by AVIC First Aircraft Institute filed Critical AVIC First Aircraft Institute
Priority to CN202111106674.6A priority Critical patent/CN114063465B/en
Publication of CN114063465A publication Critical patent/CN114063465A/en
Application granted granted Critical
Publication of CN114063465B publication Critical patent/CN114063465B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05B CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B17/00 Systems involving the use of models or simulators of said systems
    • G05B17/02 Systems involving the use of models or simulators of said systems electric

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • Automation & Control Theory (AREA)
  • Image Generation (AREA)
  • Processing Or Creating Images (AREA)

Abstract

The application belongs to the field of distributed interactive simulation systems, and particularly relates to a method for eliminating visual jitter in a distributed countermeasure simulation system, and to a visual node. The method comprises: step one, continuously acquiring entity position data from the distributed countermeasure simulation system network and storing the data locally; step two, performing entity position prediction according to the locally stored entity position data to obtain the entity position data for the next frame of the visual picture; step three, before rendering the next frame of the visual picture, judging whether the entity position data obtained from the distributed countermeasure simulation system network has been updated; if so, rendering the next frame of the visual picture according to the updated entity position data; if not, rendering the next frame of the visual picture according to the predicted entity position data. The method enhances the real-time performance of the visual simulation node data and eliminates visual picture jitter. Because it can be realized by optimizing the system software algorithm alone, its cost is low.

Description

Distributed countermeasure simulation system visual jitter elimination method and visual nodes
Technical Field
The application belongs to the field of distributed interactive simulation systems, and particularly relates to a method for eliminating visual jitter of a distributed countermeasure simulation system and a visual node.
Background
Distributed interactive simulation uses a computer network to interconnect relatively independent simulators dispersed across different regions, forming a large-scale comprehensive virtual environment in which multiple participants act in concert. In a distributed countermeasure simulation system, image jitter at the visual simulation node is a common phenomenon. It is usually caused by insufficient real-time availability of the entity motion data required for rendering the visual picture, owing to network delay, data loss, or a mismatch between this node and the synchronous computation of other nodes. In a distributed interactive simulation system, the real-time performance of data interaction among the nodes is always the key to smooth operation of the whole system.
In existing distributed simulation systems, real-time performance is generally improved by upgrading hardware. However, as the number of simulation nodes and the volume of interactive data grow, continually upgrading hardware to guarantee real-time performance multiplies the cost of the whole system, which is highly uneconomical.
Accordingly, a technical solution is desired to overcome or at least alleviate at least one of the above-mentioned drawbacks of the prior art.
Disclosure of Invention
The application aims to provide a method for eliminating visual jitter of a distributed countermeasure simulation system and a visual node, so as to solve at least one problem in the prior art.
The technical scheme of the application is as follows:
a first aspect of the present application provides a method for eliminating visual jitter of a distributed countermeasure simulation system, including:
step one, continuously acquiring entity position data from the distributed countermeasure simulation system network and storing the entity position data locally;
step two, performing entity position prediction according to the locally stored entity position data to obtain the entity position data for the next frame of the visual picture;
step three, before rendering the next frame of the visual picture, judging whether the entity position data obtained from the distributed countermeasure simulation system network has been updated;
if so, rendering the next frame of the visual picture according to the updated entity position data;
if not, rendering the next frame of the visual picture according to the predicted entity position data.
In at least one embodiment of the present application, in step two, an extrapolation algorithm is used for entity position prediction.
In at least one embodiment of the present application, in step two, performing entity position prediction according to the locally stored entity position data to obtain the entity position data for the next frame of the visual picture includes:
setting a simulation step length T;
obtaining the initial position coordinate (x_0, y_0, z_0) of the entity and the corresponding initial position vector P_0;
obtaining the position coordinate (x_i, y_i, z_i) of the entity after a time nT has elapsed, and the corresponding position vector P_i;
calculating the displacement vector of the entity at this moment, D = P_i - P_0, and the velocity vector V = D / (nT);
calculating the position vector of the entity in the next frame, P_{i+1} = P_i + V·T; that is, the position coordinate of the entity in the next frame is (x_{i+1}, y_{i+1}, z_{i+1}).
A second aspect of the present application provides a visual node, comprising:
an entity position data acquisition module, configured to continuously acquire entity position data from the distributed countermeasure simulation system network and store the entity position data locally;
an entity position prediction module, configured to perform entity position prediction according to the locally stored entity position data to obtain the entity position data for the next frame of the visual picture;
a scene picture rendering module, configured to judge, before rendering the next frame of the visual picture, whether the entity position data obtained from the distributed countermeasure simulation system network has been updated;
if so, to render the next frame of the visual picture according to the updated entity position data;
if not, to render the next frame of the visual picture according to the predicted entity position data.
In at least one embodiment of the present application, the entity position prediction module employs an extrapolation algorithm to perform entity position prediction.
In at least one embodiment of the present application, the entity position prediction module includes:
a step length setting unit, configured to set a simulation step length T;
a position coordinate acquiring unit, configured to obtain the initial position coordinate (x_0, y_0, z_0) of the entity and the corresponding initial position vector P_0, and to obtain the position coordinate (x_i, y_i, z_i) of the entity after a time nT has elapsed, together with the corresponding position vector P_i;
a first calculation module, configured to calculate the displacement vector of the entity at this moment, D = P_i - P_0, and the velocity vector V = D / (nT);
a second calculation module, configured to calculate the position vector of the entity in the next frame, P_{i+1} = P_i + V·T; that is, the position coordinate of the entity in the next frame is (x_{i+1}, y_{i+1}, z_{i+1}).
The invention has at least the following beneficial technical effects:
the method for eliminating the visual jitter of the distributed countermeasure simulation system can enhance the real-time performance of the visual simulation node data and solve the problem of visual image jitter, can be realized by optimizing a system software algorithm, is low in cost, and can be popularized and applied in the similar distributed interactive simulation system.
Drawings
Fig. 1 is a flowchart of a method for eliminating visual jitter of a distributed countermeasure simulation system according to an embodiment of the present application.
Detailed Description
In order to make the implementation objects, technical solutions and advantages of the present application clearer, the technical solutions in the embodiments of the present application will be described in more detail below with reference to the drawings in the embodiments of the present application. In the drawings, the same or similar reference numerals denote the same or similar elements or elements having the same or similar functions throughout. The described embodiments are a subset of the embodiments in the present application and not all embodiments in the present application. The embodiments described below with reference to the drawings are exemplary and intended to be used for explaining the present application and should not be construed as limiting the present application. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present application. Embodiments of the present application will be described in detail below with reference to the accompanying drawings.
In the description of the present application, it is to be understood that the terms "center", "longitudinal", "lateral", "front", "back", "left", "right", "vertical", "horizontal", "top", "bottom", "inner", "outer", and the like indicate orientations or positional relationships based on those shown in the drawings, and are used merely for convenience in describing the present application and for simplifying the description, and do not indicate or imply that the referenced device or element must have a particular orientation, be constructed in a particular orientation, and be operated, and therefore should not be construed as limiting the scope of the present application.
The present application is described in further detail below with reference to fig. 1.
The first aspect of the application provides a method for eliminating visual jitter of a distributed countermeasure simulation system, which comprises the following steps:
step one, continuously acquiring entity position data from the distributed countermeasure simulation system network and storing the entity position data locally;
step two, performing entity position prediction according to the locally stored entity position data to obtain the entity position data for the next frame of the visual picture;
step three, before rendering the next frame of the visual picture, judging whether the entity position data obtained from the distributed countermeasure simulation system network has been updated;
if so, rendering the next frame of the visual picture according to the updated entity position data;
if not, rendering the next frame of the visual picture according to the predicted entity position data.
In the method for eliminating visual jitter of the distributed countermeasure simulation system of the present application, the visual node first receives the subscribed entity position data from the distributed countermeasure simulation system network and stores it locally and synchronously. It then predicts the entity position with an extrapolation algorithm from the locally stored historical position data. Before rendering the next frame of the visual picture, the node judges whether the entity position data received from the network has been updated. If it has, the next frame of the visual picture is rendered from the updated network data; if not, the next frame is rendered from the entity position predicted from the local historical position data.
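The per-frame decision described above can be sketched in a few lines of code. The following is a hypothetical illustration, not the patent's implementation; all names (`choose_render_position`, `network_update`, `predicted_position`) are ours:

```python
# Hypothetical sketch of the per-frame decision: render from the fresh network
# update when one arrived this frame, otherwise fall back to the position
# predicted from locally stored history.

def choose_render_position(network_update, predicted_position):
    """Pick the entity position used to render the next visual frame.

    network_update: latest entity position received from the network since the
        last frame, or None if no update arrived (delay, loss, desync).
    predicted_position: position extrapolated from locally stored history.
    """
    if network_update is not None:
        return network_update      # fresh network data takes priority
    return predicted_position      # dead-reckoning fallback keeps motion smooth
```

With an update present the node renders the true networked position; without one it renders the extrapolated position, so the entity neither freezes nor snaps between frames.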
In a preferred embodiment of the present application, the process of performing entity position prediction according to the locally stored entity position data to obtain the entity position data for the next frame of the visual picture includes:
setting a simulation step length T;
obtaining the initial position coordinate (x_0, y_0, z_0) of the entity and the corresponding initial position vector P_0;
obtaining the position coordinate (x_i, y_i, z_i) of the entity after a time nT has elapsed, and the corresponding position vector P_i;
calculating the displacement vector of the entity at this moment, D = P_i - P_0, and the velocity vector V = D / (nT);
calculating the position vector of the entity in the next frame, P_{i+1} = P_i + V·T; that is, the position coordinate of the entity in the next frame is (x_{i+1}, y_{i+1}, z_{i+1}).
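The extrapolation steps above amount to first-order dead reckoning, and can be sketched as follows (a minimal illustration under the constant-velocity assumption; function and variable names are ours, not the patent's):

```python
# First-order extrapolation sketch: from the initial position P0 and the
# position Pi observed after time n*T, estimate the position one simulation
# step T ahead. Positions are (x, y, z) tuples.

def extrapolate_next_position(p0, pi, n, T):
    # displacement vector D = Pi - P0
    d = tuple(b - a for a, b in zip(p0, pi))
    # velocity vector V = D / (n * T)
    v = tuple(c / (n * T) for c in d)
    # next-frame position P_{i+1} = Pi + V * T
    return tuple(b + c * T for b, c in zip(pi, v))
```

Note that this assumes the entity moves at roughly constant velocity over the observation window; rapid maneuvers between network updates will produce a prediction error that is corrected when the next update arrives.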
Based on the above method for eliminating visual jitter of the distributed countermeasure simulation system, a second aspect of the present application provides a visual node, comprising:
an entity position data acquisition module, configured to continuously acquire entity position data from the distributed countermeasure simulation system network and store the entity position data locally;
an entity position prediction module, configured to perform entity position prediction according to the locally stored entity position data to obtain the entity position data for the next frame of the visual picture;
a scene picture rendering module, configured to judge, before rendering the next frame of the visual picture, whether the entity position data obtained from the distributed countermeasure simulation system network has been updated; if so, to render the next frame of the visual picture according to the updated entity position data; if not, to render the next frame of the visual picture according to the predicted entity position data.
In a preferred embodiment of the present application, the entity position prediction module employs an extrapolation algorithm to perform entity position prediction.
In a preferred embodiment of the present application, the entity position prediction module comprises:
a step length setting unit, configured to set a simulation step length T;
a position coordinate acquiring unit, configured to obtain the initial position coordinate (x_0, y_0, z_0) of the entity and the corresponding initial position vector P_0, and to obtain the position coordinate (x_i, y_i, z_i) of the entity after a time nT has elapsed, together with the corresponding position vector P_i;
a first calculation module, configured to calculate the displacement vector of the entity at this moment, D = P_i - P_0, and the velocity vector V = D / (nT);
a second calculation module, configured to calculate the position vector of the entity in the next frame, P_{i+1} = P_i + V·T; that is, the position coordinate of the entity in the next frame is (x_{i+1}, y_{i+1}, z_{i+1}).
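The modules described above (acquisition, prediction, rendering decision) could be combined into a single object, roughly as follows. This is a hypothetical sketch under our own naming; the patent specifies the module structure, not this code:

```python
# Minimal visual-node sketch combining the acquisition, prediction and
# rendering-decision roles described above. Prediction is first-order
# extrapolation over stored history (assumes at least two stored samples);
# the rendering decision prefers a fresh network update when one exists.

class ViewNode:
    def __init__(self, step_length):
        self.T = step_length       # simulation step length T
        self.history = []          # locally stored entity position data
        self.fresh_update = None   # update received since the last frame, if any

    def on_network_data(self, position):
        """Acquisition role: store each incoming entity position locally."""
        self.history.append(position)
        self.fresh_update = position

    def predict(self):
        """Prediction role: extrapolate the next-frame position from history."""
        p0, pi = self.history[0], self.history[-1]
        n = len(self.history) - 1          # elapsed time is n * T
        v = tuple((b - a) / (n * self.T) for a, b in zip(p0, pi))
        return tuple(b + c * self.T for b, c in zip(pi, v))

    def next_frame_position(self):
        """Rendering decision: use fresh network data if updated, else predict."""
        pos = self.fresh_update if self.fresh_update is not None else self.predict()
        self.fresh_update = None           # the update has now been consumed
        return pos
```

A node fed an update renders it directly; on the following frame, with no new data, it falls back to the extrapolated position.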
The method for eliminating visual jitter of the distributed countermeasure simulation system and the visual node of the present application effectively solve the problem of visual picture jitter caused by insufficient real-time data in the simulation system, and improve the immersion of the visual simulation node picture. The method can be realized simply by optimizing the system software algorithm, so it is low in cost, highly practical, and can be popularized in similar distributed interactive simulation systems.
The above description is only for the specific embodiments of the present application, but the scope of the present application is not limited thereto, and any changes or substitutions that can be easily conceived by those skilled in the art within the technical scope of the present application should be covered within the scope of the present application. Therefore, the protection scope of the present application shall be subject to the protection scope of the claims.

Claims (6)

1. A method for eliminating visual jitter of a distributed countermeasure simulation system is characterized by comprising the following steps:
step one, continuously acquiring entity position data from the distributed countermeasure simulation system network and storing the entity position data locally;
step two, entity position prediction is carried out according to the locally stored entity position data to obtain entity position data in the next frame of visual picture;
step three, before rendering the view picture of the next frame, judging whether entity position data obtained from the distributed countermeasure simulation system network is updated;
if yes, rendering a next frame of visual picture according to the updated entity position data;
if not, rendering the next frame of visual picture according to the predicted entity position data.
2. The method for eliminating visual jitter of a distributed countermeasure simulation system according to claim 1, characterized in that, in step two, an extrapolation algorithm is used for entity position prediction.
3. The method for eliminating visual jitter of a distributed countermeasure simulation system according to claim 2, characterized in that, in step two, performing entity position prediction according to the locally stored entity position data to obtain the entity position data for the next frame of the visual picture includes:
setting a simulation step length T;
obtaining the initial position coordinate (x_0, y_0, z_0) of the entity and the corresponding initial position vector P_0;
obtaining the position coordinate (x_i, y_i, z_i) of the entity after a time nT has elapsed, and the corresponding position vector P_i;
calculating the displacement vector of the entity at this moment, D = P_i - P_0, and the velocity vector V = D / (nT);
calculating the position vector of the entity in the next frame, P_{i+1} = P_i + V·T; that is, the position coordinate of the entity in the next frame is (x_{i+1}, y_{i+1}, z_{i+1}).
4. A view node, comprising:
an entity position data acquisition module, configured to continuously acquire entity position data from the distributed countermeasure simulation system network and store the entity position data locally;
the entity position prediction module is used for carrying out entity position prediction according to locally stored entity position data to obtain entity position data in a next frame of visual picture;
the scene picture rendering module is used for judging whether entity position data obtained from the distributed countermeasure simulation system network is updated or not before rendering the next frame of scene picture;
if yes, rendering a next frame of visual picture according to the updated entity position data;
if not, rendering the next frame of visual picture according to the predicted entity position data.
5. The view node according to claim 4, characterized in that the entity position prediction module uses an extrapolation algorithm to perform entity position prediction.
6. The view node according to claim 5, characterized in that the entity position prediction module comprises:
a step length setting unit, configured to set a simulation step length T;
a position coordinate acquiring unit, configured to obtain the initial position coordinate (x_0, y_0, z_0) of the entity and the corresponding initial position vector P_0, and to obtain the position coordinate (x_i, y_i, z_i) of the entity after a time nT has elapsed, together with the corresponding position vector P_i;
a first calculation module, configured to calculate the displacement vector of the entity at this moment, D = P_i - P_0, and the velocity vector V = D / (nT);
a second calculation module, configured to calculate the position vector of the entity in the next frame, P_{i+1} = P_i + V·T; that is, the position coordinate of the entity in the next frame is (x_{i+1}, y_{i+1}, z_{i+1}).
CN202111106674.6A 2021-09-22 2021-09-22 View jitter elimination method and view node of distributed countermeasure simulation system Active CN114063465B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202111106674.6A CN114063465B (en) 2021-09-22 2021-09-22 View jitter elimination method and view node of distributed countermeasure simulation system


Publications (2)

Publication Number Publication Date
CN114063465A true CN114063465A (en) 2022-02-18
CN114063465B CN114063465B (en) 2024-05-03

Family

ID=80234200

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202111106674.6A Active CN114063465B (en) 2021-09-22 2021-09-22 View jitter elimination method and view node of distributed countermeasure simulation system

Country Status (1)

Country Link
CN (1) CN114063465B (en)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101197647A (en) * 2006-12-13 2008-06-11 四川川大智胜软件股份有限公司 Multi-channel real-time three-dimensional vision rendering indication method
US20170163502A1 (en) * 2015-12-04 2017-06-08 CENX, Inc. Classifier based graph rendering for visualization of a telecommunications network topology
CN109002666A (en) * 2018-09-18 2018-12-14 北京华如科技股份有限公司 Emulated computation method based on DR second order algorithm and DDS-QOS
CN109100723A (en) * 2018-07-25 2018-12-28 南京信息工程大学 Upper-level winds inversion method based on Doppler radar data
CN110784299A (en) * 2019-10-25 2020-02-11 北京东方瑞丰航空技术有限公司 Low-delay multichannel visual and flight simulation synchronization method


Also Published As

Publication number Publication date
CN114063465B (en) 2024-05-03


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant