CN112419499B - Immersive situation scene simulation system - Google Patents

Immersive situation scene simulation system

Info

Publication number
CN112419499B
CN112419499B (application CN202011273978.7A)
Authority
CN
China
Prior art keywords
situation
scene
data
module
terrain
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202011273978.7A
Other languages
Chinese (zh)
Other versions
CN112419499A (en)
Inventor
俞信
胡岩峰
张翌庸
廉海明
王毅
李佳航
王晓烨
彭熊清
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Suzhou Research Institute, Institute of Electronics, Chinese Academy of Sciences
Original Assignee
Suzhou Research Institute, Institute of Electronics, Chinese Academy of Sciences
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Suzhou Research Institute, Institute of Electronics, Chinese Academy of Sciences
Priority to CN202011273978.7A
Publication of CN112419499A
Application granted
Publication of CN112419499B
Legal status: Active
Anticipated expiration

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 17/00 Three dimensional [3D] modelling, e.g. data description of 3D objects
    • G06T 17/05 Geographic models
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 16/00 Information retrieval; Database structures therefor; File system structures therefor
    • G06F 16/20 Information retrieval; Database structures therefor; File system structures therefor of structured data, e.g. relational data
    • G06F 16/29 Geographical information databases
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 15/00 3D [Three Dimensional] image rendering
    • G06T 15/005 General purpose rendering architectures
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 19/00 Manipulating 3D models or images for computer graphics
    • G06T 19/006 Mixed reality
    • Y GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02 TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02A TECHNOLOGIES FOR ADAPTATION TO CLIMATE CHANGE
    • Y02A 90/00 Technologies having an indirect contribution to adaptation to climate change
    • Y02A 90/10 Information and communication technologies [ICT] supporting adaptation to climate change, e.g. for weather forecasting or climate simulation

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Software Systems (AREA)
  • Computer Graphics (AREA)
  • Databases & Information Systems (AREA)
  • Remote Sensing (AREA)
  • General Engineering & Computer Science (AREA)
  • Geometry (AREA)
  • Data Mining & Analysis (AREA)
  • Computer Hardware Design (AREA)
  • Processing Or Creating Images (AREA)

Abstract

The invention provides an immersive situation scene simulation system. A natural environment construction module processes GIS geographic information data into terrain data that Unreal Engine 4 can use directly and generates the terrain and landscape, after which a physics engine gives the terrain and landscape physical properties to construct the scene of the situation scene simulation system. A situation visualization construction module receives the situation data sent by the back-end service, parses the attribute information of the situation data, configures the corresponding situation units based on the Unreal Engine 4 UDK toolkit, and completes the visualization processing. A scene browsing module provides tools for multi-angle observation within the scene, performs spatial transformation according to the configured movement mode, and brings users into different scenes. The invention emphasizes the simulation of natural environments such as weather and vegetation and the control of vehicles, and can bring users an immersive experience with strong visual impact.

Description

Immersive situation scene simulation system
Technical Field
The invention relates to the fields of graphics computation and physical simulation, and in particular to an immersive situation scene simulation system.
Background
A situation display system generally comprises a front-end display system and back-end services. The front end receives, processes and displays situation data and also provides the necessary geographic information system and operating functions, while the back-end services typically handle data processing and distribution. Large amounts of data are acquired through satellite imagery, positioning equipment and on-site survey and collection by personnel, and are finally combined into comprehensive multimedia information. The situation display system presents the real meaning of these data on a display device through data-driven visualization.
With successive revolutions in computer graphics, the maturing of physics engines and general improvements in computer hardware, situation display systems can show the moving states of ships, aircraft and ground vehicles on a map, support the necessary GPS positioning and monitoring, and help people gain an intuitive understanding of situational positions in many fields such as taxi dispatch, marine navigation and aviation. A user can move around a map on a large electronic screen and, against the map background, see points, each corresponding to one or more situation units. Most existing situation display systems remain in 2D or simulated-3D environments and express a situation unit as a point, block or simple model; they present situational information on a plane but have no real concept of simulation.
A situation display system that simulates every element gives people the most intuitive impression. There is therefore a continuing need to design an immersive situation scene simulation system that uses real geographic information, excellent rendering techniques and a powerful physics engine to simulate all the units of a situation display system, so that the essential situation units, such as the position, motion and heading of a helicopter, can be observed within a large scene.
Disclosure of Invention
The invention aims to provide an immersive situation scene simulation system.
The technical solution that realizes the purpose of the invention is as follows: an immersive situation scene simulation system comprising:
the natural environment construction module, which is used for processing GIS geographic information data into terrain data that Unreal Engine 4 can use directly, generating the terrain and landscape, and then using the physics engine to give the terrain and landscape physical properties, constructing the scene of the situation scene simulation system;
the situation visualization construction module, which is used for receiving the situation data sent by the service, parsing the attribute information of the situation data, configuring the corresponding situation units based on the Unreal Engine 4 UDK toolkit, and completing the visualization processing;
and the scene browsing module, which is used for providing tools for multi-angle observation within the scene, performing spatial transformation according to the configured movement mode, and bringing the user into different scenes.
Further, the natural environment construction module comprises a terrain construction module, a tree construction module, a terrain material construction module, a weather construction module, a sky construction module and an ocean construction module, wherein:
the terrain construction module is used for performing scale adjustment and format conversion on the GIS geographic information data and creating the terrain in streaming-level mode, wherein in the scale adjustment the horizontal scale is not modified at all, and the height scale is modified only so far as the height differences of the terrain are preserved; in the format conversion, the elevation map is originally in tif format and needs to be converted into the R16 format usable by Unreal Engine 4;
the tree construction module is used for setting the plant models and planting trees on the surface of the constructed terrain using a map brush or the procedural foliage generator;
the terrain material construction module is used for performing slope judgment and altitude judgment, determining the textures of different terrains, generating the materials of different terrains and applying the materials to the surface of the constructed terrain;
the weather construction module is used for constructing different weather conditions based on entities and/or screen effects;
the sky construction module is used for realizing different sky effects on the sky sphere based on texture effects and screen effects;
and the ocean construction module is used for realizing the color and light reflection of the ocean entity with the rendering engine, realizing the water-surface and underwater effects with screen effects, and realizing ocean interactivity with the physics engine.
Further, the situation visualization construction module comprises a situation data receiving module, a situation class preparation module and a situation rendering module, wherein:
the situation data receiving module is used for realizing situation information reception, situation information processing and situation display control;
the situation class preparation module provides a situation unit data storage class for storing the data of each situation unit, a storage queue Map for storing the situation unit data storage objects, and a situation unit base class, which has vehicle capability in UE4 and is the object rendered and displayed in the scene;
the situation rendering module is used for parsing the received situation data string, determining the primary key ID of the situation unit data, obtaining the uniquely identified situation unit information, creating a situation unit data storage object from that information, storing it in the queue Map, and moving, creating or deleting situation unit base class instances based on the situation unit data storage objects in the Map.
Further, the scene browsing module comprises an overhead-view rover, a following rover, a first-person rover, a bird's-eye view map and a scene switching tool, wherein:
the overhead-view rover is used for viewing the situation units and the geographic scene from high altitude;
the following rover is used for following the movement of a specified object, the position of the following rover being set in each frame;
the first-person rover is used for scene exploration and for interaction with other situation units, wherein scene exploration comprises distance measurement and position information acquisition, and the interaction content comprises collision blocking, shell feedback and unit pickup;
the bird's-eye view map is used for reflecting, in a timely manner, the scene content and situation content around the observation position;
the scene switching tool comprises a sky control interface, a situation unit panel, a self-position display interface and a compass interface, wherein the sky control interface is used for controlling the sky sphere; the situation unit panel is used for reading the position of a situation unit and moving the rover to that position; the self-position display interface is used for obtaining the longitude and latitude coordinates corresponding to the spatial position of the rover; and the compass interface is used for indicating direction information.
An immersive situation scene simulation method performs immersive situation scene simulation based on the above system.
Compared with the prior art, the invention has the following notable advantages: 1) an excellent architecture: existing situation display systems have extremely low extensibility and fundamentally cannot meet the increasingly rich requirements for situation display scenes and content, whereas the modules of the present system are mutually independent and mutually supporting, making the system easy to extend and update; 2) realistic environment effects: the system provides an immersive experience with strong visual impact, so that in use the displayed information can be grasped directly and the system can be applied simply to obtain information quickly; existing situation display systems rarely put realism first, mostly offering visual schematics or simulated demonstrations, and thus rarely achieve a fully realistic display effect; 3) suitability for diverse environments and data content: owing to its strong decoupling, the system is not limited to one particular application of situation display; the data source structure and geographic location can be modified conveniently, so the system can easily be applied to other situation demonstration requirements.
Drawings
FIG. 1 is an architectural diagram of an immersive situation scene simulation system of the present invention.
FIG. 2 is a flow chart of situation data processing.
FIG. 3 is an architecture diagram of a conventional situation display system.
FIG. 4 is a diagram comparing the present invention with a conventional situation display system.
Detailed Description
In order to make the objects, technical solutions and advantages of the present application more apparent, the present application is described in further detail below with reference to the accompanying drawings and embodiments. It should be understood that the specific embodiments described herein are merely illustrative of the present application and are not intended to limit the present application.
The invention is developed on the Unreal Engine 4 platform (hereinafter "UE4"), with much of the related work organized around a modular plug-in structure; as shown in FIG. 1, the immersive situation scene simulation system mainly comprises a natural environment construction module, a situation visualization module and a scene browsing module.
(I) Natural environment construction module
The natural environment construction module is used for processing GIS geographic information data into terrain data that UE4 can use directly, generating the terrain and landscape, and giving the terrain and landscape physical properties with the physics engine to construct the scene of the situation scene simulation system. The natural environment construction module comprises six sub-modules, namely terrain construction, tree construction, terrain material construction, weather construction, sky construction and ocean construction; the implementation of each is described below.
(1) Terrain construction module
The UE4 platform does not support elevation data in tif format, so a mapping tool must be used for format conversion. Global Mapper and World Machine are both mapping tools: Global Mapper is a planar mapping tool, while World Machine is a three-dimensional terrain tool.
The hfz format, a compressed heightfield format, is commonly used to build high-quality 3D worlds in games; elevation data can be converted to hfz using Global Mapper. If a local area of the map needs to be extracted, it can be clipped in Global Mapper before export.
The R16 format is a 16-bit raw heightmap format that UE4 can use and from which streaming levels are generated, which makes it the more suitable solution; an hfz map can be converted into R16 using World Machine. When World Machine exports the R16 terrain, cutting into 16 blocks (4×4) can be selected, and other block counts such as 1×1, 2×2, 3×3 and 5×5 can also be output. The resolution of the map can be set in World Machine and must not exceed 256 × 256, otherwise it cannot be applied in UE4. Weathering, erosion and similar effects can also be set in World Machine to increase the detail of the terrain.
UE4 creates the terrain in streaming-level mode: the 16 terrain blocks are combined into a whole, each terrain level measures its distance to the user camera, and when that distance exceeds a threshold the distant terrain blocks are not displayed; this mode reduces the rendering load on the hardware.
In addition, in data processing the horizontal scale must not be modified, while the height scale can be adjusted moderately according to the actual effect and requirements so as to preserve the height differences of the terrain; it is generally set to 150%.
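As a purely illustrative aside (not taken from the patent), the landscape Z scale corresponding to a real-world elevation range can be computed from UE4's documented convention that a 16-bit heightmap at Z scale 100 spans 512 m:

```cpp
// Minimal sketch, assuming UE4's documented landscape convention:
// a 16-bit heightmap at Z scale 100 covers a 512 m range (-256 m .. +255.992 m).
#include <cstdio>

// Hypothetical helper: landscape Z scale for a desired elevation range in meters.
float LandscapeZScale(float ElevationRangeMeters)
{
    return ElevationRangeMeters / 512.0f * 100.0f;
}

int main()
{
    // A 768 m range happens to give exactly the Z scale 150 (150%) mentioned above.
    std::printf("Z scale: %.1f\n", LandscapeZScale(768.0f));
    return 0;
}
```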
(2) Tree construction module
There are two methods to plant trees on the terrain surface within UE4:
A. Using a map brush. The required plant models are imported into the brush, the foliage function of UE4 is applied, the brush parameters are set, and trees are generated on the terrain.
B. Using the procedural foliage generator. Compared with the brush, it plants trees over a larger area with a more random generation algorithm. The procedural foliage generator is a functional module in UE4 that lets the developer control the randomness of tree planting through numerical settings.
The procedural foliage generator cannot use a tree model directly; it requires a "vegetation type" structure, so the plant models from A are set as members of vegetation types, which the generator then uses. The generator can work with several vegetation types at once, so by setting weights, more realistic and more random tree planting can be achieved. In UE4 the procedural foliage generator appears as a cuboid wireframe volume; by setting the object type, trees are planted on the surfaces of matching objects inside the volume. The generator also offers settings such as orientation, density, size-proportion weight and vegetation-type weight, which further specify the planting pattern.
(3) Terrain material construction module
The materials in the invention are mainly the terrain materials, which have the following characteristics: they adapt to slope, so that steep faces carry vegetation different from flat ground; they adapt to altitude, distinguishing the vegetation of plateaus from that of plains. A Material contains several Textures: textures used for the terrain surface and textures used for rendering ground cover. The former serve as the ground mapping, while the latter can be processed into grass textures standing on the ground. The material is applied to the constructed terrain, and the surface is programmed through the blueprint to generate the bushes and the ground mapping.
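A minimal sketch of the slope and altitude judgment described above (the thresholds and the layer enum are illustrative assumptions, not values from the patent):

```cpp
// Minimal sketch: choose a terrain layer from slope and altitude.
// Thresholds and EGroundLayer are assumptions for illustration.
#include "CoreMinimal.h"

enum class EGroundLayer { Grass, Rock, Snow };

EGroundLayer ClassifyGround(const FVector& SurfaceNormal, float AltitudeMeters)
{
    // Slope = angle between the surface normal and the world up axis.
    const float SlopeDeg = FMath::RadiansToDegrees(FMath::Acos(
        FVector::DotProduct(SurfaceNormal.GetSafeNormal(), FVector::UpVector)));

    if (SlopeDeg > 45.f)         return EGroundLayer::Rock;  // steep faces: bare rock
    if (AltitudeMeters > 3000.f) return EGroundLayer::Snow;  // high altitude: snow
    return EGroundLayer::Grass;                              // flat lowland: grass
}
```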
(4) Weather construction module
Weather can be generated based on entities or realized with screen effects, wherein:
A. Entity weather is rendered with particle effects. A particle effect is created in UE4 that generates tiny particles from a PNG image. The particles have gravity, initial velocity, acceleration and other properties, provide collision detection and support LOD display. By setting the particles to rain or snow and adjusting the other attributes, rainy and snowy weather can be realized. A particle effect also has a generation range, which the user can restrict to the vicinity of the camera position.
B. Screen effects. The weather effect is displayed on the screen and does not exist in the scene. A screen effect requires creating a HUD Widget, an interface control in UE4 that can be used to set screen effects.
C. A combination of physical weather and screen effects: particle effects are used in the nearby space while the effect is also rendered on the screen, which balances performance against simulation fidelity.
Particle effects and screen effects can be invoked from any Actor-derived class of the UDK. A weather controller is created that selects and sets the current weather according to the weather data received from the background service. Once real weather data are connected, the corresponding weather effect is displayed adaptively, feeding back the current weather state in real time.
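The weather controller could look like the following sketch (class and component names are assumptions; the patent does not give its implementation):

```cpp
// Minimal sketch: an Actor that switches particle effects according to weather
// data pushed by the background service. A HUD Widget screen effect could be
// toggled in SetWeather() as well.
#include "CoreMinimal.h"
#include "GameFramework/Actor.h"
#include "Particles/ParticleSystemComponent.h"
#include "WeatherController.generated.h"

UENUM()
enum class EWeatherType : uint8 { Clear, Rain, Snow };

UCLASS()
class AWeatherController : public AActor
{
    GENERATED_BODY()
public:
    UPROPERTY(EditAnywhere) UParticleSystemComponent* RainFX = nullptr;
    UPROPERTY(EditAnywhere) UParticleSystemComponent* SnowFX = nullptr;

    // Called whenever the background service reports a new weather state.
    void SetWeather(EWeatherType Type)
    {
        if (RainFX) RainFX->SetActive(Type == EWeatherType::Rain);
        if (SnowFX) SnowFX->SetActive(Type == EWeatherType::Snow);
    }
};
```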
(5) Sky construction module
The sky effect is realized on the sky sphere, an important scene component in UE4 that covers the entire scene; Texture effects and screen effects are mainly used on it to meet various sky display requirements.
(6) Ocean construction module
The invention uses the rendering engine to simulate the ocean; ocean systems have many implementations in game design. The key point of the invention is to combine the situation units with the ocean system so that the motion attitude of ship situation units becomes more realistic. Liquid simulation is a difficult problem for current game engines, and liquid is usually simulated by combining screen effects with the rendering engine. The invention simulates the ocean as follows:
A. Use the rendering engine to create the ocean entity and adjust the color and reflection of the sea. Like the terrain, the ocean surface is expressed with a Material and covered by an ocean material. Texture maps can be set in the material, and normal maps can be used to enhance the layered look of the water-surface texture. The material surface can be set to display waves, and the wave height of the ocean surface can be driven programmatically. To obtain the height of the ocean surface, an external interface must be provided.
B. Use screen effects to express the water-surface and underwater effects. A post-process volume (PostProcessVolume) control in UE4 provides the screen-effect display; this differs from the weather implementation above, which uses the HUD Widget for its screen effects. Here color filters are placed in front of the screen: a blue underwater filter, a distortion filter and so on. The post-process volume produces the filtered visual effect only when the camera is submerged.
C. Use the physics engine to add ocean interactivity. The ocean has buoyancy, drag, current speed and similar characteristics, and interacts with the situation units and the user. Once the ocean height range is obtained, whether the current position is underwater can be determined. A buoyancy component is provided to apply buoyancy and detect the underwater state. After each ship situation unit is associated with the buoyancy component, it can float on the water surface.
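A buoyancy component of the kind described might be sketched as follows (a flat sea level and the force constants are assumptions; the patent states only that such a component exists):

```cpp
// Minimal sketch: push the owning body upward in proportion to its submersion depth.
#include "CoreMinimal.h"
#include "Components/ActorComponent.h"
#include "Components/PrimitiveComponent.h"
#include "BuoyancyComponent.generated.h"

UCLASS(meta = (BlueprintSpawnableComponent))
class UBuoyancyComponent : public UActorComponent
{
    GENERATED_BODY()
public:
    UBuoyancyComponent() { PrimaryComponentTick.bCanEverTick = true; }

    UPROPERTY(EditAnywhere) float SeaLevelZ = 0.f;      // assumed world-space sea height (cm)
    UPROPERTY(EditAnywhere) float BuoyancyPerCm = 50.f; // assumed force factor per cm of depth

    virtual void TickComponent(float DeltaTime, ELevelTick TickType,
                               FActorComponentTickFunction* ThisTickFunction) override
    {
        Super::TickComponent(DeltaTime, TickType, ThisTickFunction);
        UPrimitiveComponent* Body =
            Cast<UPrimitiveComponent>(GetOwner()->GetRootComponent());
        if (!Body || !Body->IsSimulatingPhysics()) return;

        const float Depth = SeaLevelZ - GetOwner()->GetActorLocation().Z;
        if (Depth > 0.f) // submerged: upward force grows with depth
        {
            Body->AddForce(FVector(0.f, 0.f, Depth * BuoyancyPerCm * Body->GetMass()));
        }
    }
};
```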
(II) Situation visualization construction module
The situation visualization construction module is used for receiving the situation data sent by the service, parsing the attribute information of the situation data, configuring the corresponding situation units based on the Unreal Engine 4 UDK toolkit, and completing the visualization processing. The situation visualization construction module comprises a situation data receiving module, a situation class preparation module and a situation rendering module; the implementation of each is described below.
(1) Situation data receiving module
The invention uses ZeroMQ to receive situation data; the system acts as the subscriber in a subscribe/publish pattern. A traditional situation display system processes situation data from generation through collection and collation to final display on a device. The key points of the invention are situation information reception, situation information processing, situation display control, situation unit display and situation data query. Referring to FIG. 2, the situation data receiving module implements situation information reception, situation information processing and situation display control, and is implemented as a C++ class, namely a situation visualization class derived from the UDK's Actor.
After connecting to the server in subscriber mode, the situation data string sent by the service is received in the class's BeginPlay(), and situation data are received continuously in a loop. Owing to the data characteristics of ZMQ, the situation data string records the information of each situation unit in turn in every cycle.
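The subscriber side could be sketched as follows (the endpoint, buffer size and non-blocking polling are assumptions; the patent states only that ZeroMQ's subscribe/publish pattern and a receive loop are used):

```cpp
// Minimal sketch: a ZeroMQ SUB socket receiving situation data strings. A real UE4
// integration would poll non-blockingly (as here, e.g. from Tick) or receive on a
// worker thread rather than block the game thread in BeginPlay().
#include <zmq.h>
#include <string>

static void* GCtx = nullptr;
static void* GSub = nullptr;

void ConnectToSituationService()
{
    GCtx = zmq_ctx_new();
    GSub = zmq_socket(GCtx, ZMQ_SUB);
    zmq_connect(GSub, "tcp://127.0.0.1:5556");   // assumed endpoint
    zmq_setsockopt(GSub, ZMQ_SUBSCRIBE, "", 0);  // subscribe to all messages
}

// Returns true and fills Out when a message is pending; false otherwise.
bool PollSituationString(std::string& Out)
{
    char Buf[4096];
    const int Len = zmq_recv(GSub, Buf, sizeof(Buf) - 1, ZMQ_DONTWAIT); // non-blocking
    if (Len < 0) return false;                  // EAGAIN: nothing this frame
    const int Stored = Len < (int)sizeof(Buf) - 1 ? Len : (int)sizeof(Buf) - 1;
    Out.assign(Buf, Stored);
    return true;
}
```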
The structure of the situation data is as follows:
[ situation unit name;
a situation unit number;
a situation unit type;
a situational unit longitude and latitude position;
whether the situational units have been displayed;
organization to which the situation unit belongs;
situation unit last update date ].
The situation unit number is the primary key ID, the unique identifier of the situation unit. The situation unit type is used to determine the category of the unit. Whether the situation unit is already displayed is auxiliary data required by the invention. The last update date is used to judge the timeliness of the data. The situation unit name, the longitude/latitude position and the owning organization are business information. The data structure can be extended with further information according to business requirements.
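Parsing one record of this structure could look like the following sketch (the ';' separator, the "lon,lat" sub-format and all names are assumptions based on the field list above):

```cpp
// Minimal sketch: split one situation record into a struct, in the field order
// listed above. Separator ';' and position sub-format "lon,lat" are assumptions.
#include "CoreMinimal.h"

struct FSituationUnitData
{
    FString Name;
    FString Id;            // situation unit number: primary key, unique identifier
    FString Type;          // aircraft, ship, submarine, vehicle, ...
    double  Longitude = 0.0;
    double  Latitude  = 0.0;
    bool    bDisplayed = false;
    FString Organization;
    FString LastUpdate;    // used to judge data timeliness
};

bool ParseSituationUnit(const FString& Record, FSituationUnitData& Out)
{
    TArray<FString> Fields;
    Record.ParseIntoArray(Fields, TEXT(";"), /*InCullEmpty=*/false);
    if (Fields.Num() < 7) return false;  // malformed record

    Out.Name = Fields[0];
    Out.Id   = Fields[1];
    Out.Type = Fields[2];
    FString Lon, Lat;                    // assumed "lon,lat" position sub-format
    if (Fields[3].Split(TEXT(","), &Lon, &Lat))
    {
        Out.Longitude = FCString::Atod(*Lon);
        Out.Latitude  = FCString::Atod(*Lat);
    }
    Out.bDisplayed   = Fields[4].ToBool();
    Out.Organization = Fields[5];
    Out.LastUpdate   = Fields[6];
    return true;
}
```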
(2) Situation class preparation module
The situation unit data storage class is used for storing the data of each situation unit; the storage queue Map is used for storing the situation unit data storage objects; and the situation unit base class inherits the Pawn class of the UDK, has vehicle capability in UE4 and is the object rendered and displayed in the scene.
According to their different types, situation units inherit the situation unit base class, creating situation unit subclasses (i.e., situation unit data storage classes). The situation unit types include aircraft, ships, submarines and vehicles; the type is judged in the BeginPlay() function of the situation unit base class so that the corresponding subclass is created. If no category is set for a situation unit, a default category is used. Each situation unit subclass has a different SkeletalMesh for storing its skeletal model. The subclasses store attributes of their own types and expose characteristic external interfaces through which each situation unit performs its particular behaviors. The behavior interfaces of the situation unit subclasses can be invoked by the rovers or by the UMG interface.
The common attributes of situation units are stored in the situation unit base class. The situation unit base class inherits the Pawn class and overrides the movement component, providing an external interface MoveTo(Vector3D DestinationPoint). The situation visualization class calls this interface to control the movement of the situation unit base class. In the MoveTo() function, the position of the situation unit is first obtained; the angle between the situation unit and the target position (DestinationPoint) is computed from the spatial positions, giving the cruise angle in the horizontal direction and the pitch angle in the height direction. The target position is a longitude/latitude coordinate, which differs from the coordinates in the UE4 scene, so a coordinate conversion based on the map projection must be performed to obtain the UE4 scene coordinates. The situation unit base class holds variables such as Speed, situation unit Type, and whether there is no height movement (IsHorizontal). In the Tick() function of the situation unit base class, the position of the situation unit is computed every frame: the horizontal position from the cruise angle and speed, the height position from the pitch angle and speed; the unit's position is set with SetActorLocation(), and at the same time the unit's cruise angle, pointing at the target, is set with SetActorRotation(). If there is no height movement (IsHorizontal = true), the height is left at the unit's own value in Tick() (no height is set), and the situation unit is subject to gravity and moves on the ground according to the physics. A buoyancy component is added to the situation unit base class; if IsHorizontal = true and the unit is within the ocean range, the situation unit floats on the water surface under buoyancy.
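The per-frame movement described above might be sketched like this (a simplified stand-in for the patent's base class; UCLASS/GENERATED_BODY boilerplate is omitted):

```cpp
// Minimal sketch: derive cruise (yaw) and pitch angles from the vector to the
// target, then advance the position by Speed each frame.
#include "CoreMinimal.h"
#include "GameFramework/Pawn.h"

class ASituationUnitBase : public APawn
{
public:
    FVector Destination = FVector::ZeroVector; // target, already in UE4 scene coordinates
    float   Speed = 1000.f;                    // cm/s
    bool    bIsHorizontal = false;             // true: no height movement

    void MoveTo(const FVector& DestinationPoint) { Destination = DestinationPoint; }

    virtual void Tick(float DeltaTime) override
    {
        Super::Tick(DeltaTime);
        FVector ToTarget = Destination - GetActorLocation();
        if (bIsHorizontal) ToTarget.Z = 0.f;  // ignore the height component
        if (ToTarget.IsNearlyZero()) return;

        // Cruise angle in the horizontal plane, pitch angle in the height direction.
        const FRotator Heading = ToTarget.Rotation();
        SetActorRotation(FRotator(Heading.Pitch, Heading.Yaw, 0.f));
        SetActorLocation(GetActorLocation() +
                         ToTarget.GetSafeNormal() * Speed * DeltaTime);
    }
};
```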
(3) Situation rendering module
The situation rendering module parses the received situation data string, determines the primary key ID of the situation unit data, and obtains the uniquely identified situation unit information; it then creates a situation unit data storage object from this information and stores it in the queue Map. It also checks whether the received string and the situation data have cycled into the next round; when the next round of situation data arrives, the behavior of the situation unit base classes is updated. Based on the situation unit data storage objects in the Map, situation unit base class instances are moved, created or deleted. The movement of a situation unit includes horizontal movement and height movement.
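One round of this create/move/delete pass over the Map could be sketched as follows (LonLatToScene and SpawnUnit are hypothetical helpers; the types come from the earlier sketches):

```cpp
// Minimal sketch: reconcile the live units with the latest round of situation data.
#include "CoreMinimal.h"

FVector LonLatToScene(double Lon, double Lat);                             // hypothetical helper
ASituationUnitBase* SpawnUnit(const FSituationUnitData&, const FVector&);  // hypothetical helper

void ApplySituationRound(const TMap<FString, FSituationUnitData>& Round,
                         TMap<FString, ASituationUnitBase*>& Live)
{
    // Create or move: every record in this round needs a live unit heading to its position.
    for (const auto& Pair : Round)
    {
        const FVector Target = LonLatToScene(Pair.Value.Longitude, Pair.Value.Latitude);
        if (ASituationUnitBase** Found = Live.Find(Pair.Key))
            (*Found)->MoveTo(Target);                          // existing unit: move
        else
            Live.Add(Pair.Key, SpawnUnit(Pair.Value, Target)); // new unit: create
    }
    // Delete: units absent from this round are removed from the scene.
    for (auto It = Live.CreateIterator(); It; ++It)
    {
        if (!Round.Contains(It.Key()))
        {
            It.Value()->Destroy();
            It.RemoveCurrent();
        }
    }
}
```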
(III) Scene browsing module
The scene browsing module is used for providing tools for multi-angle observation within the scene, performing spatial transformation according to the configured movement mode and bringing the user into different scenes. The invention keeps the functions of a traditional situation display system while focusing more on simulation and interaction, so as to improve realism and immersion. Traditional situation display systems are only used to view situation units and are not designed to be interactive; the present invention provides immersive interaction with the situation units, which is its biggest difference from conventional systems. The 3D scene browsing function is realized jointly by several rovers. The invention uses rovers assisted by UI interfaces to increase the observation capability within the scene (cf. the manipulation shown in FIG. 3). The scene browsing module comprises an overhead-view rover, a following rover, a first-person rover, a bird's-eye view map and a scene switching tool; the implementation of each is described below.
(1) Overhead-view rover
The overhead-view rover is the default rover. It is used to view situation units and the geographic scene from high altitude and is realized by inheriting the UDK's Pawn class. The overhead-view rover can rotate its view freely and move freely; its camera and movement functions are implemented with custom programming. The rover provides quick operations such as view zooming and multiple-speed movement, realized by adding the corresponding keyboard and mouse event inputs.
(2) Following rover
The following rover is structurally similar to the overhead-view rover and likewise inherits the Pawn class of the UDK. Its main function is to follow the movement of a specified object, realized by setting the position of the following rover in every frame. The following rover needs an Actor queue to store the objects designated for following; it is set Public so that the followed objects can be assigned in the scene, and the position of the designated object is then obtained through an index value. The index value is changed through keyboard input, achieving the effect of switching between objects. The following rover can interact with the followed object, which is usually a situation unit: in the program, obtaining a reference to the followed object allows its behavior interface to be called, so the following rover receives event input and invokes the behavior interface of the followed object to control the behavior of the situation unit.
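The follow behavior might be sketched like this (the offset value is an assumption; boilerplate omitted as before):

```cpp
// Minimal sketch: each frame the rover is placed at a fixed offset from the
// followed object, selectable by index.
#include "CoreMinimal.h"
#include "GameFramework/Pawn.h"

class AFollowRover : public APawn
{
public:
    TArray<AActor*> FollowTargets;                 // set Public; assigned in the scene
    int32   Index = 0;                             // changed via keyboard input
    FVector Offset = FVector(-600.f, 0.f, 300.f);  // assumed camera offset (cm)

    virtual void Tick(float DeltaTime) override
    {
        Super::Tick(DeltaTime);
        if (FollowTargets.IsValidIndex(Index) && FollowTargets[Index])
        {
            SetActorLocation(FollowTargets[Index]->GetActorLocation() + Offset);
        }
    }
};
```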
(3) First-person rover
The first-person rover inherits Character or Pawn of the UDK: character mode inherits the Character class, and vehicle mode inherits the Pawn class. This rover must support scene exploration as well as interaction with other situation units. Scene exploration comprises ranging and position information acquisition. For ranging, the unit casts a probe ray into the distance and performs a line trace to obtain the collision point; the difference between that point and the rover's position gives the distance between the two points in the scene, which is then converted through the spatial information into the actual distance. For position acquisition, the real longitude and latitude of the current position are obtained through formula conversion, which must take into account the projection mode, the boundary of the map terrain and the horizontal scaling. The interaction content comprises collision blocking, shell feedback and unit pickup: for collision blocking, collision detection between the first-person rover and the situation units is enabled by setting the rover to Block in its attributes; for shell feedback, the shells fired by the first-person rover inherit an interaction control of the situation unit, which detects collisions with situation units and computes unit damage; for unit pickup, the first-person rover uses a line trace to pick a situation unit in space, obtain a reference to the situation unit object and call its behavior function interface.
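The ranging function could be sketched as follows (the probe length and the WorldToRealScale factor are assumptions):

```cpp
// Minimal sketch: line trace along the view direction; the hit distance is then
// converted to a real-world distance via the horizontal scale factor.
#include "CoreMinimal.h"
#include "GameFramework/Pawn.h"
#include "Engine/World.h"

float MeasureDistance(const APawn* Rover, float WorldToRealScale)
{
    const FVector Start = Rover->GetActorLocation();
    const FVector End   = Start + Rover->GetActorForwardVector() * 1000000.f; // 10 km probe

    FHitResult Hit;
    if (Rover->GetWorld()->LineTraceSingleByChannel(Hit, Start, End, ECC_Visibility))
    {
        // Distance between the rover and the collision point, in real units.
        return FVector::Dist(Start, Hit.ImpactPoint) * WorldToRealScale;
    }
    return -1.f; // nothing hit within range
}
```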
Rover switching is implemented as follows: the current controller is obtained through GetPlayerController() in the UDK, and the controller calls the Possess() function to set the rover object to switch to, while releasing the rover that is no longer needed.
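A sketch of that switch:

```cpp
// Minimal sketch: possess the new rover and release the old one.
#include "CoreMinimal.h"
#include "Kismet/GameplayStatics.h"

void SwitchRover(UObject* WorldContext, APawn* NewRover)
{
    APlayerController* PC = UGameplayStatics::GetPlayerController(WorldContext, 0);
    if (PC && NewRover)
    {
        PC->UnPossess();        // release the rover that is no longer needed
        PC->Possess(NewRover);  // take control of the new rover
    }
}
```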
(4) Bird's-eye view map
The bird's-eye view map is placed at the lower-right corner of the interface and reflects, in a timely manner, the scene content and situation content around the observation position. On the basis of the rover, a SceneCapture2D control is added, its position is set, and the camera property is changed to orthographic projection (ortho); a RenderTarget is created in the UE4 content folder to receive the pictures captured by the SceneCapture2D; a material is then created to receive the content of the RenderTarget. A material can receive Texture content, and a RenderTarget is a kind of Texture. A UMG class is then created, an Image control is added to the UMG, and its picture content is set to the material above. In the rover's BeginPlay() function, a UMG object is created with the bird's-eye map UMG class selected, and AddToViewport() is called to add the bird's-eye map to the screen interface.
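The capture setup could be sketched as follows (the capture height and ortho width are assumptions):

```cpp
// Minimal sketch: a SceneCapture2D above the observation position, rendering
// orthographically into the RenderTarget that the UMG Image material reads.
#include "CoreMinimal.h"
#include "Engine/SceneCapture2D.h"
#include "Components/SceneCaptureComponent2D.h"
#include "Engine/TextureRenderTarget2D.h"

void SetupMinimapCapture(ASceneCapture2D* Capture, UTextureRenderTarget2D* Target,
                         const FVector& RoverLocation)
{
    USceneCaptureComponent2D* Cam = Capture->GetCaptureComponent2D();
    Cam->ProjectionType = ECameraProjectionMode::Orthographic;
    Cam->OrthoWidth     = 100000.f;  // assumed: a 1 km-wide view
    Cam->TextureTarget  = Target;    // the RenderTarget the minimap material reads

    // Look straight down from above the rover.
    Capture->SetActorLocation(RoverLocation + FVector(0.f, 0.f, 50000.f)); // assumed height
    Capture->SetActorRotation(FRotator(-90.f, 0.f, 0.f));
}
```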
(5) Scene switching tool
In addition to the bird's-eye map, there are several interfaces: the sky control interface, the situation unit panel and the self-position display interface. The sky control interface is realized by creating a UMG interface and adding buttons to it. A sky-sphere variable is added to the UMG interface control to reference the sky-sphere object of the current scene. Adding the sky control interface to the screen works the same way as adding the bird's-eye map: before AddToViewport() is called on the interface in the rover, GetAllActorsOfClass() is called with the class set to the sky sphere. GetAllActorsOfClass() returns an array in which one sky-sphere object is stored; the sky control interface UMG references this sky sphere and controls it through its buttons.
The situation unit panel is a UMG control that obtains all situation units with GetAllActorsOfClass(), using the same object reference method as the sky control interface. A ListView is added to the situation unit panel to hold the information bars of the situation units. A situation unit information bar is also a UMG control, added into the ListView control by the situation unit panel UMG. Each information bar references its corresponding situation unit and reads the unit's information. TextBlocks are added to the information bar to display the unit's information, and a button control is added to realize the viewing function: the button-click event reads the position of the situation unit, sets the rover's own position and moves it to the position of the situation unit.
The self-position display interface is realized with a UMG control. The spatial position of the rover is obtained through the GetActorLocation() function, after which the corresponding longitude and latitude coordinates are obtained through formula conversion. A longitude/latitude coordinate variable is added to the rover to record its position. TextBlocks are added to the rover's position display interface and their values are bound to the rover's longitude/latitude variable, so the rover's position is displayed in real time in the TextBlocks. The compass interface is likewise realized with a UMG control, with its value set in a similar way to the position display.
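The coordinate read-out could be sketched as follows (the patent says only "formula conversion"; a linear equirectangular mapping around a reference origin is assumed here):

```cpp
// Minimal sketch: convert the rover's scene coordinates back to longitude/latitude
// under an assumed linear mapping (X -> longitude, Y -> latitude).
#include "CoreMinimal.h"

FVector2D SceneToLonLat(const FVector& SceneLocation,
                        double OriginLon, double OriginLat, double CmPerDegree)
{
    const double Lon = OriginLon + SceneLocation.X / CmPerDegree;
    const double Lat = OriginLat - SceneLocation.Y / CmPerDegree; // sign depends on projection
    return FVector2D((float)Lon, (float)Lat);
}
```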
In conclusion, by designing a modular plug-in structure, integrating technologies such as GIS, remote-sensing interpretation, game engines and physical simulation, and using the Unreal Engine 4 game engine with functionally cohesive modules, the invention creates a simulated situation display system. The invention particularly emphasizes the simulation of natural environments such as weather and vegetation, and immersive interaction such as vehicle control.
The technical features of the above embodiments can be combined arbitrarily; for brevity, not all possible combinations of the technical features in the above embodiments are described, but as long as the combinations contain no contradiction, they should be considered within the scope of this specification.
The above embodiments express only several implementations of the present application, and although their description is specific and detailed, they should not be construed as limiting the scope of the invention. It should be noted that a person skilled in the art can make several variations and improvements without departing from the concept of the present application, and these fall within its scope of protection. Therefore, the protection scope of this patent application shall be subject to the appended claims.

Claims (5)

1. An immersive situation scene simulation system, comprising:
the natural environment construction module is used for processing GIS geographic information data into terrain data directly usable by Unreal Engine 4, generating the terrain and landscape, and giving the terrain and landscape physical properties with the physics engine to construct the scene of the situation scene simulation system;
the situation visualization construction module is used for receiving the situation data sent by the service, parsing the attribute information of the situation data, configuring the corresponding situation units based on the Unreal Engine 4 UDK toolkit, and completing the visualization processing;
and the scene browsing module is used for providing tools for multi-angle observation within the scene, performing spatial transformation according to the configured movement mode, and bringing the user into different scenes.
2. The immersive situation scene simulation system of claim 1, wherein the natural environment construction module comprises six sub-modules, namely terrain construction, tree construction, terrain material construction, weather construction, sky construction and ocean construction, wherein:
the terrain construction module is used for performing scale adjustment and format conversion on the GIS geographic information data and creating the terrain in streaming-level mode, wherein in the scale adjustment the horizontal scale is not modified at all, and the height scale is modified only so far as the height differences of the terrain are preserved; in the format conversion, the elevation map is originally in tif format and needs to be converted into the R16 format usable by Unreal Engine 4;
the tree construction module is used for setting the plant models and planting trees on the surface of the constructed terrain using a map brush or the procedural foliage generator;
the terrain material construction module is used for performing slope judgment and altitude judgment, determining the textures of different terrains, generating the materials of different terrains and applying the materials to the surface of the constructed terrain;
the weather construction module is used for constructing different weather conditions based on entities and/or screen effects;
the sky construction module is used for realizing different sky effects on the sky sphere based on texture effects and screen effects;
and the ocean construction module is used for realizing the color and light reflection of the ocean entity with the rendering engine, realizing the water-surface and underwater effects with screen effects, and realizing ocean interactivity with the physics engine.
3. The immersive situation scene simulation system of claim 1, wherein the situation visualization construction module comprises a situation data receiving module, a situation class preparation module and a situation rendering module, wherein:
the situation data receiving module is used for realizing situation information reception, situation information processing and situation display control;
the situation class preparation module provides a situation unit data storage class for storing the data of each situation unit, a storage queue Map for storing the situation unit data storage objects, and a situation unit base class, which has vehicle capability in Unreal Engine 4 and is the object rendered and displayed in the scene;
the situation rendering module is used for parsing the received situation data string, determining the primary key ID of the situation unit data, obtaining the uniquely identified situation unit information, creating a situation unit data storage object from that information, storing it in the storage queue Map, and moving, creating or deleting situation unit base class instances based on the situation unit data storage objects in the Map.
4. The immersive situation scene simulation system of claim 1, wherein the scene browsing module comprises an overhead-view rover, a following rover, a first-person rover, a bird's-eye view map and a scene switching tool, wherein:
the overhead-view rover is used for viewing the situation units and the geographic scene from high altitude;
the following rover is used for following the movement of a specified object, the position of the following rover being set in each frame;
the first-person rover is used for scene exploration and for interaction with other situation units, wherein scene exploration comprises distance measurement and position information acquisition, and the interaction content comprises collision blocking, shell feedback and unit pickup;
the bird's-eye view map is used for reflecting, in a timely manner, the scene content and situation content around the observation position;
the scene switching tool comprises a sky control interface, a situation unit panel, a self-position display interface and a compass interface, wherein the sky control interface is used for controlling the sky sphere; the situation unit panel is used for reading the position of a situation unit and moving the rover to that position; the self-position display interface is used for obtaining the longitude and latitude coordinates corresponding to the spatial position of the rover; and the compass interface is used for indicating direction information.
5. An immersive situation scene simulation method, characterized in that immersive situation scene simulation is performed based on the system of any one of claims 1 to 4.
CN202011273978.7A 2020-11-14 2020-11-14 Immersive situation scene simulation system Active CN112419499B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202011273978.7A CN112419499B (en) 2020-11-14 2020-11-14 Immersive situation scene simulation system

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202011273978.7A CN112419499B (en) 2020-11-14 2020-11-14 Immersive situation scene simulation system

Publications (2)

Publication Number Publication Date
CN112419499A CN112419499A (en) 2021-02-26
CN112419499B (en) 2022-11-29

Family

ID=74830837

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011273978.7A Active CN112419499B (en) 2020-11-14 2020-11-14 Immersive situation scene simulation system

Country Status (1)

Country Link
CN (1) CN112419499B (en)

Families Citing this family (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113421327A (en) * 2021-05-24 2021-09-21 郭宝宇 Three-dimensional model construction method and device and electronic equipment
CN114500685B (en) * 2022-01-11 2023-07-21 中国人民解放军国防科技大学 Third party communication library bridging method and system adapting to illusion engine application
CN116362045B (en) * 2023-03-31 2024-02-06 中国科学院空间应用工程与技术中心 Lunar geographic information system and lunar surface activity simulation method
CN116894000B (en) * 2023-05-29 2023-12-08 中国船舶集团有限公司第七〇七研究所 Information conversion method, device, electronic equipment and storage medium
CN116414316B (en) * 2023-06-08 2023-12-22 北京掌舵互动科技有限公司 Illusion engine rendering method based on BIM model in digital city

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20030081812A1 (en) * 2001-10-26 2003-05-01 Hiromichi Yamamoto System and method of presenting altitude visibility information provision system utilizing satellite image, and system and method of recognizing flight obstacle
US20130208494A1 (en) * 2012-02-14 2013-08-15 Russell C. Jones Emergency vehicle lighting apparatus including a light bar that can be raised to increase visibility during an emergency
CN103544677A (en) * 2013-11-01 2014-01-29 中国人民解放军信息工程大学 Space-air-ground integration situational expression engine and shaking elimination method
CN103679381A (en) * 2013-12-23 2014-03-26 广东威创视讯科技股份有限公司 Collaborative plotting method and system for situation map

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Research on the Design of a Virtual Battlefield Situation Simulation System; Sun Zhenghao et al.; Computer Simulation; 2018-12-15; Vol. 35, No. 12; pp. 309-312, 324 *

Also Published As

Publication number Publication date
CN112419499A (en) 2021-02-26

Similar Documents

Publication Publication Date Title
CN112419499B (en) Immersive situation scene simulation system
Koller et al. Virtual GIS: A real-time 3D geographic information system
Piekarski et al. Interactive augmented reality techniques for construction at a distance of 3D geometry
Piekarski Interactive 3d modelling in outdoor augmented reality worlds
CN108140260A (en) The generation of 3D models and user interface from map datum
CN111784833A (en) WebGL-based flood evolution situation three-dimensional dynamic visualization display method
US20170046878A1 (en) Augmented reality mobile application
WO2011112667A2 (en) Integrated gis system with interactive 3d interface
CN108765576B (en) OsgEarth-based VIVE virtual earth roaming browsing method
Polis et al. Automating the construction of large-scale virtual worlds
Liarokapis et al. Mobile augmented reality techniques for geovisualisation
Davis et al. CAVE-VR and unity game engine for visualizing city scale 3d meshes
CN113877210A (en) Game scene conversion method, system, server and computer readable storage medium
Spicer et al. Producing usable simulation terrain data from UAS-collected imagery
Zhou et al. Customizing visualization in three-dimensional urban GIS via web-based interaction
Greenwood et al. Using game engine technology to create real-time interactive environments to assist in planning and visual assessment for infrastructure
Bjørkli et al. Archaeology and augmented reality. Visualizing stone age sea level on location
Chen et al. Generate 3D triangular meshes from spliced point clouds with cloudcompare
Abdelguerfi 3D synthetic environment reconstruction
Giertsen et al. An open system for 3D visualisation and animation of geographic information
Wang et al. Improving Construction Demonstrations by Integrating BIM, UAV, and VR
Kaspar et al. Holographic mixed reality: an enhanced technology for visualizing and evaluating complex 3D geologic data
CN117011492B (en) Image rendering method and device, electronic equipment and storage medium
Čypas et al. Preparation of 3D digital city model development technology based on geoinformation systems
Zhou et al. Modeling and visualizing 3d urban environment via internet for urban planning and monitoring

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant