CN115909858B - Flight simulation experience system based on VR image - Google Patents

Flight simulation experience system based on VR image Download PDF

Info

Publication number
CN115909858B
CN115909858B CN202310213585.4A CN202310213585A CN115909858B CN 115909858 B CN115909858 B CN 115909858B CN 202310213585 A CN202310213585 A CN 202310213585A CN 115909858 B CN115909858 B CN 115909858B
Authority
CN
China
Prior art keywords
user
terrain
resolution
characteristic value
representing
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202310213585.4A
Other languages
Chinese (zh)
Other versions
CN115909858A (en
Inventor
韩权
刘敏
张猛
邱鹏
李钊滨
陈益民
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shenzhen Nantianmen Network Information Co ltd
Original Assignee
Shenzhen Nantianmen Network Information Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shenzhen Nantianmen Network Information Co ltd filed Critical Shenzhen Nantianmen Network Information Co ltd
Priority to CN202310213585.4A priority Critical patent/CN115909858B/en
Publication of CN115909858A publication Critical patent/CN115909858A/en
Application granted granted Critical
Publication of CN115909858B publication Critical patent/CN115909858B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Links

Images

Abstract

The invention relates to the technical field of data processing, in particular to a flight simulation experience system based on VR images, which is used for obtaining all terrain subareas, obtaining a visual range area and a projection plane of a user according to the viewpoint and the main viewing direction of the user, obtaining a resolution characteristic value and a texture detail characteristic value of each terrain subarea in the visual range area, obtaining all terrain subareas and terrain elevation data matrixes of the user, calculating relative scheduling resources and relative computing resources of the user according to the terrain elevation data matrixes, distributing computer hardware resources according to the scheduling resource occupation ratio and the computing resource occupation ratio of the user, scheduling generation resources of the user to generate VR display images of the user, better reasonably distributing the computer hardware resources in a multi-user scene, and guaranteeing the image quality of VR display equipment of each user in the multi-user scene.

Description

Flight simulation experience system based on VR image
Technical Field
The invention relates to the technical field of data processing, in particular to a flight simulation experience system based on VR images.
Background
The flight simulation experience system is simulation equipment for flight training and education science popularization, and the most main application of the flight simulation experience system is training of pilots, and the flight simulation experience system is widely applied in the popularization background of civil aviation in recent years due to the advantages of safety, reliability, economy and no limitation of weather conditions. With the continuous development of VR technology, combining VR technology with the flight simulation experience system further improves the immersive performance and interactivity of the flight simulation system, so that the flight scene is more real, but with the continuous development of VR technology, higher requirements are put forward on the performance of computer-generated graphics.
The flight simulation experience system needs to rapidly process huge terrain data and texture map data to realize continuity of flight simulation pictures and avoid picture tearing, but as the requirements of users on picture quality of the flight simulation system are gradually improved at the same time, the hardware cost of a computer is also larger and larger, and how to better coordinate hardware resources of the computer under a multi-user scene so as to achieve the aim of ensuring the picture quality of a user VR display end becomes an industry difficulty to be solved urgently. In the prior art, in order to save the cost of computer hardware, the terrain data and texture data required by the flight simulation picture are generally adjusted in real time according to the change of the flight viewpoint. Under a multi-user scene, different users have different main view directions, the prior art utilizes a terrain LOD model algorithm to carry out different levels of storage scheduling on the terrain data to achieve the aim of saving computer hardware resources, but the mode cannot meet the comprehensive scheduling of the terrain data under a multi-user target, and the picture quality of each user VR display device under the multi-user scene cannot be ensured, so that a method for comprehensively scheduling the terrain data through multiple main view directions to further ensure the picture quality of the multi-user VR display device is needed.
Disclosure of Invention
The invention provides a flight simulation experience system based on VR images, which aims to solve the existing problems.
The invention relates to a flight simulation experience system based on VR images, which adopts the following technical scheme:
one embodiment of the present invention provides a VR image-based flight simulation experience system, comprising:
the terrain area blocking module is used for blocking the terrain area to obtain all terrain areas;
the regional resource dividing module divides the texture data packet to obtain texture data sub-packets of each terrain sub-region, sets different resolution levels and obtains terrain elevation data blocks of all resolution levels of each terrain sub-region;
the regional characteristic acquisition module acquires a visual range region and a projection plane of a user according to the viewpoint and the main viewing direction of the user, and acquires a resolution characteristic value and a texture detail characteristic value of each terrain subarea in the visual range region;
the regional characteristic grading module is used for grading the resolution characteristic values and the texture detail characteristic values to obtain all resolution characteristic value levels and texture detail characteristic value levels;
the user resource characteristic acquisition module acquires all the terrain sub-areas and the terrain elevation data matrixes of the user, and calculates the relative scheduling resources and the relative calculation resources of the user according to the terrain elevation data matrixes;
the user resource acquisition module takes the target level terrain elevation data blocks and the target texture data sub-packets corresponding to all the processing terrain sub-areas of the user as the generation resources of the user;
and the hardware resource allocation module is used for calculating the dispatching resource duty ratio and the computing resource duty ratio of the user, allocating the computer hardware resource according to the dispatching resource duty ratio and the computing resource duty ratio of the user, and generating the generating resource of the dispatching user and generating the VR display picture of each user.
Further, the setting of different resolution levels includes the following specific steps:
dividing the highest resolution level into the lowest resolution level for multi-resolution level division into a second preset number of resolution levels, wherein the resolution levels
Figure SMS_3
The resolution level 1 is the lowest resolution level; the number of data points in the terrain elevation data block at resolution level 1 is +.>
Figure SMS_5
Within the range, the number of data points in the terrain elevation data block at resolution level 2 is
Figure SMS_7
Within the scope, resolution level +.>
Figure SMS_2
The number of data points in the terrain elevation data block of +.>
Figure SMS_4
Within the range of>
Figure SMS_6
Representing a second presetQuantity of->
Figure SMS_8
The number of data points in the terrain elevation data block representing the lowest resolution level, +.>
Figure SMS_1
Representing the number of data points in the terrain elevation data block for the highest high resolution level.
Further, the method for acquiring the visual range area and the projection plane of the user according to the viewpoint and the main viewing direction of the user comprises the following specific steps:
when a plurality of users use the flight simulation equipment at the same time, the viewpoints of the plurality of users are the same viewpoint, and the main viewing directions of the plurality of users are a plurality of;
acquiring a visual range area on a terrain area according to the viewpoint by combining the view angle limitation of the flight simulation equipment; and combining the VR equipment information, and acquiring a projection plane of each user according to the main viewing direction of each user.
Further, the method for obtaining the resolution characteristic value and the texture detail characteristic value comprises the following steps:
acquiring the distance between the center point of each terrain subarea and the viewpoint in the visual range area, and recording the distance as the viewpoint distance of each terrain subarea; acquiring the distance between the central point of each terrain subarea and the central point of the projection plane of each user, and recording the distance as the main viewing distance between each terrain subarea and each user;
and dividing the viewpoint distance of the terrain subarea by a first value to obtain a resolution characteristic value of the terrain subarea, dividing the primary viewing distance of the terrain subarea and a user by a second value to obtain a texture detail characteristic value of the terrain subarea and the user, wherein the second value is the maximum distance from the projection plane of the user to the central point of the projection plane.
Further, the steps of obtaining all the resolution characteristic value levels and the texture detail characteristic value levels comprise the following specific steps:
the value range according to the resolution characteristic value is [0,1]Obtaining all resolution eigenvalue levelsComprising: the resolution characteristic value is in
Figure SMS_9
When the range is within the range, the resolution characteristic value is in the range of 1, and the resolution characteristic value is in the range of +.>
Figure SMS_10
In the range, the resolution characteristic value is in the range of 2, and the resolution characteristic value is in the range of +.>
Figure SMS_11
Within the range, belonging to the resolution eigenvalue class +.>
Figure SMS_12
,/>
Figure SMS_13
Representing a second preset number;
according to the value range [0,1 ] of the characteristic value of the texture detail]Obtaining all texture detail feature value levels, including: texture detail feature value is at
Figure SMS_14
When the texture detail feature value is in the range, the texture detail feature value belongs to the texture detail feature value level 1, and the texture detail feature value is in the range
Figure SMS_15
In the range, the texture detail characteristic value is in the level 2, and similarly, the texture detail characteristic value is in the range +.>
Figure SMS_16
Within the range, belonging to the texture detail eigenvalue level +.>
Figure SMS_17
,/>
Figure SMS_18
Representing a third preset number. />
Further, the step of acquiring all the processing terrain sub-areas and the terrain elevation data matrixes of the user comprises the following specific steps:
for any terrain subarea, acquiring the lowest level of the texture detail characteristic value levels of the terrain subarea and a plurality of users, and marking the user corresponding to the lowest level of the texture detail characteristic value level as a processing user of the terrain subarea; acquiring processing users corresponding to each topographic subarea in the visual range area, and distributing each topographic subarea in the visual range area to the corresponding processing user as the processing topographic subarea of the processing user;
for any one user, counting all processing terrain areas of the user, wherein the processing terrain areas belong to the resolution characteristic value level
Figure SMS_19
And belongs to the texture detail eigenvalue level +.>
Figure SMS_20
The number of the terrain subareas is used as the first +.>
Figure SMS_21
Line, th->
Figure SMS_22
Column elements.
Further, the calculation method of the relative scheduling resource comprises the following steps:
Figure SMS_23
in the method, in the process of the invention,
Figure SMS_25
indicating the relative scheduling resource of the ith user, < +.>
Figure SMS_28
Representing the number of processing terrain areas belonging to the ith user,/->
Figure SMS_30
The (th) in the terrain elevation data matrix representing the (th) user>
Figure SMS_26
Line, th->
Figure SMS_27
Column element->
Figure SMS_29
Represents the natural logarithm based on the natural constant e, < ->
Figure SMS_31
Representing a second preset number,/->
Figure SMS_24
Representing a third preset number.
Further, the computing method of the relative computing resource comprises the following steps:
Figure SMS_32
in the method, in the process of the invention,
Figure SMS_35
representing the relative computing resources of the ith user, < +.>
Figure SMS_37
Representing the number of processing terrain areas belonging to the ith user,/->
Figure SMS_39
The (th) in the terrain elevation data matrix representing the (th) user>
Figure SMS_34
Line, th->
Figure SMS_36
Column element->
Figure SMS_38
Represents the natural logarithm based on the natural constant e, < ->
Figure SMS_40
Representing a second preset number,/->
Figure SMS_33
Representing a third preset number.
Further, the method for acquiring the target level terrain elevation data block and the target texture data sub-packet comprises the following steps:
according to the resolution characteristic value level of the user processing the topographic subarea
Figure SMS_41
Obtaining a resolution level corresponding to the processing topography sub-area>
Figure SMS_42
Is recorded as a target level terrain elevation data block for processing the terrain sub-area according to the texture detail characteristic value level of the user for processing the terrain sub-area>
Figure SMS_43
Setting the utilization rate of texture data sub-packets corresponding to the processing topography sub-regions as +.>
Figure SMS_44
Target texture data sub-packet, denoted as processing terrain sub-area,>
Figure SMS_45
representing a third preset number.
Further, the calculating the scheduling resource ratio and the calculating resource ratio of the user comprises the following specific steps:
Figure SMS_46
Figure SMS_47
in the method, in the process of the invention,
Figure SMS_50
representing the scheduled resource duty cycle of the ith user, < > j->
Figure SMS_52
Indicating the relative scheduling resource of the ith user, < +.>
Figure SMS_54
Indicating the number of users simultaneously using the flight simulation device, < >>
Figure SMS_49
Representing the relative scheduling resources of the kth user; />
Figure SMS_51
Representing the computing resource duty cycle of the ith user, < ->
Figure SMS_53
Representing the relative computing resources of the ith user, < +.>
Figure SMS_55
Indicating the number of users simultaneously using the flight simulation device, < >>
Figure SMS_48
Representing the relative computing resources of the kth user. />
The technical scheme of the invention has the beneficial effects that: compared with the existing single-user flight simulation experience system, the multi-user flight simulation experience system provided by the invention acquires the vision range area and the projection plane of the user according to the viewpoint and the main vision direction of the user, acquires the resolution characteristic value and the texture detail characteristic value of each terrain subarea in the vision range area, calculates the relative scheduling resource and the relative computing resource of the user according to the terrain elevation data matrix, distributes the computer hardware resource according to the scheduling resource occupation ratio and the computing resource occupation ratio of the user, schedules the generation resource of the user to generate the VR display picture of the user, better and reasonably distributes the computer hardware resource under the multi-user scene, and ensures the picture quality of each user VR display device under the multi-user scene.
Drawings
In order to more clearly illustrate the embodiments of the invention or the technical solutions in the prior art, the drawings that are required in the embodiments or the description of the prior art will be briefly described, it being obvious that the drawings in the following description are only some embodiments of the invention, and that other drawings may be obtained according to these drawings without inventive effort for a person skilled in the art.
FIG. 1 is a flow chart of steps of a VR image-based flight simulation experience system of the present invention;
fig. 2 is a schematic view of the visual range area and projection plane of a user of the present invention.
Detailed Description
In order to further describe the technical means and effects adopted by the present invention to achieve the preset purpose, the following detailed description refers to specific implementation, structure, features and effects of a VR image-based flight simulation experience system according to the present invention with reference to the accompanying drawings and preferred embodiments. In the following description, different "one embodiment" or "another embodiment" means that the embodiments are not necessarily the same. Furthermore, the particular features, structures, or characteristics of one or more embodiments may be combined in any suitable manner.
Unless defined otherwise, all technical and scientific terms used herein have the same meaning as commonly understood by one of ordinary skill in the art to which this invention belongs.
The following specifically describes a specific scheme of a flight simulation experience system based on VR images provided by the invention with reference to the accompanying drawings.
Referring to fig. 1, a flowchart illustrating steps of a VR image-based flight simulation experience system according to an embodiment of the present invention is shown, where the system includes: the system comprises a terrain area partitioning module, an area resource dividing module, an area characteristic obtaining module, an area characteristic grading module, a user resource characteristic obtaining module, a user resource obtaining module and a hardware resource distributing module.
The terrain area blocking module is used for blocking the terrain area to obtain all terrain areas.
The method specifically comprises the following steps:
partitioning a terrain area of a flight simulation scene to obtain all terrain areas, wherein the method comprises the following steps: the whole terrain area of the flight simulation scene is a square area, the whole terrain area is uniformly divided into a first preset number of terrain subareas (square areas), the areas of all the terrain subareas are uniformly divided, and the areas of the terrain subareas are uniformly marked as S, and the area units of the terrain subareas are square decimeters. A first predetermined number in this embodiment
Figure SMS_56
10000, in other embodiments, the practitioner may set the first preset number as desired.
The regional resource division module is used for dividing the DEM terrain elevation data and the texture data packet according to all the terrain subareas.
The method specifically comprises the following steps:
setting different resolution levels, including: the DEM terrain elevation data of the terrain area are used for drawing the terrain of the terrain area, the DEM terrain elevation data corresponding to each terrain subarea are recorded as terrain elevation data blocks, the number of data points in the terrain elevation data blocks of different resolution levels of the terrain subarea is different, and the number of data points in the terrain elevation data blocks of higher resolution levels is larger. The highest resolution level of the terrain elevation data block is determined by the source of the DEM terrain elevation data, while the lowest resolution level of the terrain elevation data block is set by the user himself.
In the embodiment of the invention, the number of data points in the terrain elevation data block with the highest resolution level is
Figure SMS_57
Wherein S represents the area of the terrain sub-area, < ->
Figure SMS_58
Representing a first unit area, the first unit area being 0.25 square meters; the number of data points in the terrain elevation data block of the lowest resolution level is +.>
Figure SMS_59
Wherein S represents the area of the terrain sub-area, < ->
Figure SMS_60
Representing a second unit area, in this embodiment 100 square meters, and in other embodiments, the practitioner may set the second unit area as desired.
Dividing the highest resolution level into the lowest resolution level for multi-resolution level division into a second preset number of resolution levels, wherein the resolution levels
Figure SMS_62
The resolution level 1 is the lowest resolution level; the number of data points in the terrain elevation data block at resolution level 1 is +.>
Figure SMS_65
Within the range, the number of data points in the terrain elevation data block at resolution level 2 is
Figure SMS_67
Within the scope, resolution level +.>
Figure SMS_63
The number of data points in the terrain elevation data block of +.>
Figure SMS_64
Within the range of>
Figure SMS_66
Representing a second preset number,/->
Figure SMS_68
The number of data points in the terrain elevation data block representing the lowest resolution level, +.>
Figure SMS_61
Terrain elevation data block representing highest resolution levelNumber of data points.
A second predetermined number in this embodiment
Figure SMS_69
50, in other embodiments, the practitioner may set a second predetermined number as desired.
Storing terrain elevation data blocks of different resolution levels for the terrain sub-area according to a terrain LOD model algorithm, comprising: for any one terrain subarea, obtaining terrain elevation data blocks of all resolution levels of the terrain subarea, and storing the terrain elevation data blocks of each resolution level of the terrain subarea in corresponding units; the terrain elevation data blocks for all resolution levels of all terrain sub-areas are stored in corresponding cells.
The terrain area of the flight simulation scene corresponds to a texture data packet, and the texture data packet is used for rendering the terrain area after the terrain of the terrain area is drawn, and drawing specific texture details; according to the block result of the terrain area, the texture data packet is divided into a plurality of texture data sub-packets, each terrain sub-packet corresponds to one texture data sub-packet, the texture data sub-packet is used for rendering the terrain sub-packet, and the higher the utilization rate of the texture data sub-packet is, the higher the rendering degree of the terrain sub-packet is, the more the texture details of the terrain sub-packet are, and the better the visual effect is.
The regional characteristic acquisition module is used for acquiring the resolution characteristic value and the texture detail characteristic value of the terrain subarea.
The method specifically comprises the following steps:
when a plurality of users use the flight simulation equipment at the same time, the viewpoints and the main viewing directions of the plurality of users are obtained through the positioning device arranged in the flight simulation equipment, and because the viewpoints of different users on the same flight simulation equipment are approximately the same and only the main viewing directions are different, the viewpoints of the plurality of users are considered to be the same viewpoint, and the main viewing directions of the plurality of users are multiple.
The visual range area on the terrain area is obtained according to the viewpoint in combination with the view angle limitation of the flight simulation device, as shown in fig. 2, which is the prior art, and will not be described in detail here; and combining the VR equipment information, and acquiring a projection plane of each user according to the main viewing direction of each user, as shown in figure 2.
Acquiring the distance between the center point of each terrain subarea and the viewpoint in the visual range area, and recording the distance as the viewpoint distance of each terrain subarea; and acquiring the distance between the central point of each terrain subarea and the central point of the projection plane of each user, and recording the distance as the main viewing distance between each terrain subarea and each user.
The requirements of the user on the resolution level and texture details of each terrain subarea in the visual range area are related to the viewpoint distance of each terrain subarea and the main viewing distance of each terrain subarea and the user: the larger the viewpoint distance of the terrain subarea is, the farther the terrain subarea is from the viewpoint, the smaller the requirement of the user on the resolution of the terrain subarea is, namely, the larger the viewpoint distance is, the terrain elevation data blocks with lower resolution level of the terrain subarea are required to be used for forming the simulated terrain (under the main view direction) of the user; the smaller the viewpoint distance is, the closer the terrain subarea is to the viewpoint, the larger the requirement of the user on the resolution of the terrain elevation data block of the terrain subarea is, namely, the smaller the viewpoint distance is, the terrain elevation data block with the higher resolution level of the terrain subarea is required to be used for forming the simulated terrain (in the main view direction) of the user.
It should be further noted that, the larger the main viewing distance between the topographic subregion and the user, the farther the topographic subregion is from the projection plane of the user, the smaller the requirement of the user on the texture details of the topographic subregion (determined by the human eye imaging characteristics), the lower the usage rate of the texture data sub-packet, that is, the larger the main viewing distance, the fewer the texture details of the topographic subregion are required, and the lower the usage rate of the texture data sub-packet is; the smaller the main viewing distance is, the closer the topographic subarea is to the projection plane of the user, the larger the requirement of the user on the texture details (determined by the human eye imaging characteristics) of the topographic subarea is, namely the smaller the main viewing distance is, the more the texture details of the topographic subarea are required, and the higher the usage rate of the texture data subpacket is.
In summary, the requirements of the user on the resolution level and texture detail of the terrain subarea are respectively represented by the viewpoint distance of the terrain subarea and the main viewing distance of the terrain subarea and the user; the method comprises the steps that a result of dividing a viewpoint distance of a terrain subarea by a first numerical value is used as a resolution characteristic value of the terrain subarea, the first numerical value is a visual distance set by a flight simulation system, and the visual distance is determined by setting parameters of the flight simulation system; and dividing the primary viewing distance between the topographic subregion and the user by a second numerical value which is the maximum distance between the topographic subregion and the center point of the projection plane on the projection plane of the user as the characteristic value of the texture detail between the topographic subregion and the user. The resolution characteristic value and the texture detail characteristic value are normalized results.
The regional characteristic grading module is used for grading the resolution characteristic value and the texture detail characteristic value of the terrain subarea.
The method specifically comprises the following steps:
the larger the viewpoint distance of the topographic subarea is, the larger the resolution characteristic value of the topographic subarea is, and the smaller the requirement of the user on the resolution of the topographic subarea is; therefore, the larger the resolution characteristic value of the terrain subarea is, the smaller the resolution characteristic value level of the terrain subarea is, and at this time, the smaller the requirement of the user on the resolution of the terrain subarea is. Classifying the resolution characteristic values and the texture detail characteristic values, wherein the resolution characteristic values are divided into a second preset number of resolution characteristic value levels, and the value range of the resolution characteristic values is [0,1 ] as the resolution characteristic values are normalized results]All resolution eigenvalue levels are specifically: the resolution characteristic value is in
Figure SMS_70
When the range is within, the resolution characteristic value is in the resolution characteristic value level 1, and the resolution characteristic value is in the resolution characteristic value range
Figure SMS_71
In the range, the resolution characteristic value is in the range of 2, and the resolution characteristic value is in the range of +.>
Figure SMS_72
Within the range, belong to the resolution specialSign value grade->
Figure SMS_73
,/>
Figure SMS_74
Representing a second preset number, resolution characteristic value level +.>
Figure SMS_75
For the highest resolution eigenvalue level, resolution eigenvalue level 1 is the lowest resolution eigenvalue level.
The larger the main viewing distance between the topographic subarea and the user is, the larger the characteristic values of the topographic subarea and the texture details of the user are, and the smaller the requirement of the user on the texture details of the topographic subarea is; therefore, the larger the texture detail characteristic values of the terrain subarea and the user are, the smaller the texture detail characteristic value level of the terrain subarea is, and at the moment, the smaller the requirement of the user on the texture detail of the terrain subarea is. Classifying the texture detail characteristic values, wherein the texture detail characteristic values are divided into a third preset number of texture detail characteristic value levels, and the range of the texture detail characteristic values is [0,1 ] because the texture detail characteristic values are normalized results]All texture detail feature value levels are specifically: texture detail feature value is at
Figure SMS_76
When the texture detail feature value is within the range, the texture detail feature value is in the range of +.>
Figure SMS_77
In the range, the texture detail characteristic value is in the level 2, and similarly, the texture detail characteristic value is in the range +.>
Figure SMS_78
Within the range, belonging to the texture detail eigenvalue level +.>
Figure SMS_79
,/>
Figure SMS_80
Representing a third preset numberTexture detail eigenvalue level->
Figure SMS_81
For the highest texture detail feature value level, texture detail feature value level 1 is the lowest texture detail feature value level.
Third preset quantity in this embodiment
Figure SMS_82
10, in other embodiments, the practitioner may set a third predetermined number as desired.
The user resource characteristic acquisition module is used for the relative scheduling resources and the relative computing resources of the user.
The method specifically comprises the following steps:
it should be noted that, for the user, each terrain subarea in the visual range area of the user has a resolution characteristic value level and a texture detail characteristic value level, but not the terrain elevation data block corresponding to each terrain subarea is scheduled by the computing device of the user, so as to be used for forming the simulated terrain of the user; that is, for the user, it is the final objective to obtain accurate scheduling and texture drawing of each terrain sub-area within the visual range area of the user. Based on the logic, for any terrain subarea, the terrain subarea and each user have texture detail characteristic values, the terrain subarea and each user have texture detail characteristic value levels, and the terrain elevation data block corresponding to the terrain subarea should be called by the user with the lowest texture detail characteristic value level of the terrain subarea and used for forming the simulated terrain of the user, otherwise, the terrain elevation data block corresponding to the terrain subarea is important for the user.
For any terrain subarea, acquiring the lowest level of the texture detail characteristic value levels of the terrain subarea and a plurality of users, and marking the user corresponding to the lowest level of the texture detail characteristic value level as a processing user of the terrain subarea; each of the topographical sub-areas within the visual range area has a corresponding processing user, and each of the topographical sub-areas within the visual range area is assigned to the processing user as a processing topographical sub-area for the processing user.
For any one user, counting all processing terrain areas of the user, wherein the processing terrain areas belong to the resolution characteristic value level
Figure SMS_85
And belongs to the texture detail eigenvalue level +.>
Figure SMS_89
The number of treated terrain areas>
Figure SMS_93
Quantity->
Figure SMS_86
First ∈h in the terrain elevation data matrix as user>
Figure SMS_88
Line, th->
Figure SMS_92
Elements of columns, where->
Figure SMS_95
,/>
Figure SMS_83
Is common->
Figure SMS_87
Individual resolution feature value levels, in common +.>
Figure SMS_91
The level of texture detail feature values, therefore the user's terrain elevation data matrix is of the size
Figure SMS_94
,/>
Figure SMS_84
Representing a second preset number,/->
Figure SMS_90
Representing a third preset number.
The calculation formula of the relative scheduling resource of the user is specifically as follows:
Figure SMS_96
in the method, in the process of the invention,
Figure SMS_98
indicating the relative scheduling resource of the ith user, < +.>
Figure SMS_102
Representing the number of processing terrain areas belonging to the ith user,/->
Figure SMS_105
The (th) in the terrain elevation data matrix representing the (th) user>
Figure SMS_99
Line, th->
Figure SMS_100
The elements of the column (i.e. belonging to the resolution eigenvalue class +.>
Figure SMS_103
And belongs to the texture detail eigenvalue level +.>
Figure SMS_106
The number of treated terrain areas) of (1) a>
Figure SMS_97
Represents the natural logarithm based on the natural constant e, < ->
Figure SMS_101
Representing a second preset number,/->
Figure SMS_104
Representing a third preset number; processing terrain sub-areas with feature value levels belonging to different resolutionsEntropy of the number of (i) representing the relative scheduling resource of the i-th user,/the relative scheduling resource of the i-th user>
Figure SMS_107
The larger the resolution characteristic value level of the sub-region of the processing terrain of the ith user is, the more the resources need to be scheduled when the simulation terrain of the ith user is formed.
The calculation formula of the relative calculation resource of the user is specifically:
Figure SMS_108
in the method, in the process of the invention,
Figure SMS_110
representing the relative computing resources of the ith user, < +.>
Figure SMS_115
Representing the number of processing terrain areas belonging to the ith user,/->
Figure SMS_120
The (th) in the terrain elevation data matrix representing the (th) user>
Figure SMS_111
Line, th->
Figure SMS_114
The elements of the column (i.e. belonging to the resolution eigenvalue class +.>
Figure SMS_118
And belongs to the texture detail eigenvalue level +.>
Figure SMS_122
The number of treated terrain areas) of (1) a>
Figure SMS_109
Represents the natural logarithm based on the natural constant e, < ->
Figure SMS_113
Representing a second preset number,/->
Figure SMS_117
Representing a third preset number; />
Figure SMS_121
All processing terrain areas representing the ith user belonging to the texture detail feature value level +.>
Figure SMS_112
The higher the level of texture detail feature value of the processed terrain sub-area, the more +.>
Figure SMS_116
The larger the requirement of the user on the texture details of the terrain subareas is, the higher the utilization rate of the texture data sub-packets is, the calculation force required for processing the terrain elevation data blocks corresponding to the terrain subareas is +.>
Figure SMS_119
The larger the required computing resources are, the more the relative computing resources of the user are +>
Figure SMS_123
The larger.
The user resource acquisition module is used for acquiring the generated resources of the user.
The method specifically comprises the following steps:
for any user, all the processing terrain subareas belonging to the user correspond to the target level terrain elevation data blocks and the target texture data sub-packets, namely the generating resources which need to be scheduled by the user; the method for processing the target level terrain elevation data block and the target texture data sub-packet corresponding to the terrain sub-region comprises the following steps: according to the resolution characteristic value level of the user processing the topographic subarea
Figure SMS_124
Obtaining the resolution corresponding to the processed topography subareasLevel->
Figure SMS_125
Is recorded as a target level terrain elevation data block for processing the terrain sub-area according to the texture detail characteristic value level of the user for processing the terrain sub-area>
Figure SMS_126
Setting the utilization rate of texture data sub-packets corresponding to the processing topography sub-regions as +.>
Figure SMS_127
Denoted as target texture data sub-packets that process the terrain sub-area.
The hardware resource allocation module is used for calculating the dispatching resource duty ratio and the calculation resource duty ratio of the user and allocating the computer hardware resources.
The method specifically comprises the following steps:
according to the relative scheduling resource and the relative computing resource of the user, the scheduling resource duty ratio and the computing resource duty ratio of the user are calculated, specifically:
Figure SMS_128
Figure SMS_129
in the method, in the process of the invention,
Figure SMS_131
representing the scheduled resource duty cycle of the ith user, < > j->
Figure SMS_133
Indicating the relative scheduling resource of the ith user, < +.>
Figure SMS_135
Indicating the number of users simultaneously using the flight simulation device, < >>
Figure SMS_132
Representing the relative scheduling resources of the kth user; />
Figure SMS_134
Representing the computing resource duty cycle of the ith user, < ->
Figure SMS_136
Representing the relative computing resources of the ith user, < +.>
Figure SMS_137
Indicating the number of users simultaneously using the flight simulation device, < >>
Figure SMS_130
Representing the relative computing resources of the kth user.
The larger the relative scheduling resource and the relative computing resource of the user, the larger the scheduling resource proportion and the computing resource proportion of the user, the more the scheduling resource and the computing resource are needed by the user, the larger the corresponding needed computing power, and the more the computer hardware resources are allocated to the user.
And allocating corresponding computer hardware resources to each user according to the dispatching resource proportion and the computing resource proportion of all the users, and generating resources for dispatching the users to generate VR display images of the users.
The system comprises a terrain area blocking module, an area resource dividing module, an area characteristic acquisition module, an area characteristic grading module, a user resource characteristic acquisition module, a user resource acquisition module and a hardware resource allocation module. Compared with the existing single-user flight simulation experience system, the multi-user flight simulation experience system provided by the invention acquires the vision range area and the projection plane of the user according to the viewpoint and the main vision direction of the user, acquires the resolution characteristic value and the texture detail characteristic value of each terrain subarea in the vision range area, calculates the relative scheduling resource and the relative computing resource of the user according to the terrain elevation data matrix, distributes the computer hardware resource according to the scheduling resource occupation ratio and the computing resource occupation ratio of the user, schedules the generation resource of the user to generate the VR display picture of the user, better and reasonably distributes the computer hardware resource under the multi-user scene, and ensures the picture quality of each user VR display device under the multi-user scene.
The foregoing description of the preferred embodiments of the invention is not intended to be limiting, but rather is intended to cover all modifications, equivalents, alternatives, and improvements that fall within the spirit and scope of the invention.

Claims (1)

1. A VR image based flight simulation experience system, the system comprising:
the terrain area blocking module is used for blocking the terrain area to obtain all terrain areas;
the regional resource dividing module divides the texture data packet to obtain texture data sub-packets of each terrain sub-region, sets different resolution levels and obtains terrain elevation data blocks of all resolution levels of each terrain sub-region;
the regional characteristic acquisition module acquires a visual range region and a projection plane of a user according to the viewpoint and the main viewing direction of the user, and acquires a resolution characteristic value and a texture detail characteristic value of each terrain subarea in the visual range region;
the regional characteristic grading module is used for grading the resolution characteristic values and the texture detail characteristic values to obtain all resolution characteristic value levels and texture detail characteristic value levels;
the user resource characteristic acquisition module acquires all the terrain sub-areas and the terrain elevation data matrixes of the user, and calculates the relative scheduling resources and the relative calculation resources of the user according to the terrain elevation data matrixes;
the user resource acquisition module takes the target level terrain elevation data blocks and the target texture data sub-packets corresponding to all the processing terrain sub-areas of the user as the generation resources of the user;
the hardware resource allocation module is used for calculating the dispatching resource proportion and the computing resource proportion of the users, allocating computer hardware resources according to the dispatching resource proportion and the computing resource proportion of the users, and generating resources for dispatching the users to generate VR display pictures of the users;
the setting of different resolution levels comprises the following specific steps:
dividing the highest resolution level into the lowest resolution level for multi-resolution level division into a second preset number of resolution levels, wherein the resolution levels
Figure QLYQS_3
The resolution level 1 is the lowest resolution level; the number of data points in the terrain elevation data block at resolution level 1 is +.>
Figure QLYQS_5
Within the range, the number of data points in the terrain elevation data block at resolution level 2 is
Figure QLYQS_8
Within the scope, resolution level +.>
Figure QLYQS_2
The number of data points in the terrain elevation data block of +.>
Figure QLYQS_4
Within the range of>
Figure QLYQS_6
Representing a second preset number,/->
Figure QLYQS_7
The number of data points in the terrain elevation data block representing the lowest resolution level, +.>
Figure QLYQS_1
A number of data points in the terrain elevation data block representing the highest high resolution level;
the method for acquiring the visual range area and the projection plane of the user according to the viewpoint and the main viewing direction of the user comprises the following specific steps:
when a plurality of users use the flight simulation equipment at the same time, the viewpoints of the plurality of users are the same viewpoint, and the main viewing directions of the plurality of users are a plurality of;
acquiring a visual range area on a terrain area according to the viewpoint by combining the view angle limitation of the flight simulation equipment; combining VR equipment information, and acquiring a projection plane of each user according to the main viewing direction of each user;
the method for acquiring the resolution characteristic value and the texture detail characteristic value comprises the following steps:
acquiring the distance between the center point of each terrain subarea and the viewpoint in the visual range area, and recording the distance as the viewpoint distance of each terrain subarea; acquiring the distance between the central point of each terrain subarea and the central point of the projection plane of each user, and recording the distance as the main viewing distance between each terrain subarea and each user;
the method comprises the steps of taking a result of dividing a viewpoint distance of a terrain subarea by a first numerical value as a resolution characteristic value of the terrain subarea, taking a result of dividing a main viewing distance of the terrain subarea and a user by a second numerical value as a texture detail characteristic value of the terrain subarea and the user, wherein the second numerical value is a maximum distance from a projection plane of the user to a center point of the projection plane;
the method for obtaining all the resolution characteristic value levels and the texture detail characteristic value levels comprises the following specific steps:
the value range according to the resolution characteristic value is [0,1]All resolution eigenvalue levels are obtained, including: the resolution characteristic value is in
Figure QLYQS_9
When the range is within the range, the resolution characteristic value is in the range of 1, and the resolution characteristic value is in the range of +.>
Figure QLYQS_10
In the range, the resolution characteristic value is in the range of 2, and the resolution characteristic value is in the range of +.>
Figure QLYQS_11
Within the range, belonging to the resolution eigenvalue class +.>
Figure QLYQS_12
,/>
Figure QLYQS_13
Representing a second preset number;
according to the value range [0,1 ] of the characteristic value of the texture detail]Obtaining all texture detail feature value levels, including: texture detail feature value is at
Figure QLYQS_14
When the texture detail feature value is within the range, the texture detail feature value is in the range of +.>
Figure QLYQS_15
In the range, the texture detail characteristic value is in the level 2, and similarly, the texture detail characteristic value is in the range +.>
Figure QLYQS_16
Within the range, belonging to the texture detail eigenvalue level +.>
Figure QLYQS_17
,/>
Figure QLYQS_18
Representing a third preset number;
the method for acquiring all the processing terrain subareas and the terrain elevation data matrixes of the user comprises the following specific steps:
for any terrain subarea, acquiring the lowest level of the texture detail characteristic value levels of the terrain subarea and a plurality of users, and marking the user corresponding to the lowest level of the texture detail characteristic value level as a processing user of the terrain subarea; acquiring processing users corresponding to each topographic subarea in the visual range area, and distributing each topographic subarea in the visual range area to the corresponding processing user as the processing topographic subarea of the processing user;
for any one user, all processing topography areas of the user are countedIn the domain, belonging to the resolution eigenvalue class
Figure QLYQS_19
And belongs to the texture detail eigenvalue level +.>
Figure QLYQS_20
The number of the terrain subareas is used as the first +.>
Figure QLYQS_21
Line, th->
Figure QLYQS_22
Elements of a column;
the calculation method of the relative scheduling resource comprises the following steps:
Figure QLYQS_25
in (1) the->
Figure QLYQS_28
Indicating the relative scheduling resource of the ith user, < +.>
Figure QLYQS_30
Representing the number of processing terrain areas belonging to the ith user,/->
Figure QLYQS_24
The (th) in the terrain elevation data matrix representing the (th) user>
Figure QLYQS_27
Line, th->
Figure QLYQS_29
Column element->
Figure QLYQS_31
Represents the natural logarithm based on the natural constant e, < ->
Figure QLYQS_23
Representing a second preset number,/->
Figure QLYQS_26
Representing a third preset number;
the computing method of the relative computing resource comprises the following steps:
Figure QLYQS_33
in (1) the->
Figure QLYQS_37
Representing the relative computing resources of the ith user, < +.>
Figure QLYQS_39
Representing the number of processing terrain areas belonging to the ith user,/->
Figure QLYQS_34
The (th) in the terrain elevation data matrix representing the (th) user>
Figure QLYQS_36
Line, th->
Figure QLYQS_38
Column element->
Figure QLYQS_40
Represents the natural logarithm based on the natural constant e, < ->
Figure QLYQS_32
Representing a second preset number,/->
Figure QLYQS_35
Representing a third preset number;
the method for acquiring the target level terrain elevation data block and the target texture data sub-packet comprises the following steps:
according to the distribution of the user's processing topography sub-areasLevel of resolution feature value
Figure QLYQS_41
Obtaining a resolution level corresponding to the processing topography sub-area>
Figure QLYQS_42
Is recorded as a target level terrain elevation data block for processing the terrain sub-area according to the texture detail characteristic value level of the user for processing the terrain sub-area>
Figure QLYQS_43
Setting the utilization rate of texture data sub-packets corresponding to the processing topography sub-regions as +.>
Figure QLYQS_44
Target texture data sub-packet, denoted as processing terrain sub-area,>
Figure QLYQS_45
representing a third preset number;
the method comprises the following specific steps of:
Figure QLYQS_47
Figure QLYQS_51
in (1) the->
Figure QLYQS_54
Representing the scheduled resource duty cycle of the ith user, < > j->
Figure QLYQS_46
Indicating the relative scheduling resource of the ith user, < +.>
Figure QLYQS_49
Representing simultaneous use of flight simulation equipmentNumber of spare users->
Figure QLYQS_52
Representing the relative scheduling resources of the kth user; />
Figure QLYQS_55
Representing the computing resource duty cycle of the ith user, < ->
Figure QLYQS_48
Representing the relative computing resources of the ith user, < +.>
Figure QLYQS_50
Indicating the number of users simultaneously using the flight simulation device, < >>
Figure QLYQS_53
Representing the relative computing resources of the kth user. />
CN202310213585.4A 2023-03-08 2023-03-08 Flight simulation experience system based on VR image Active CN115909858B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202310213585.4A CN115909858B (en) 2023-03-08 2023-03-08 Flight simulation experience system based on VR image

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202310213585.4A CN115909858B (en) 2023-03-08 2023-03-08 Flight simulation experience system based on VR image

Publications (2)

Publication Number Publication Date
CN115909858A CN115909858A (en) 2023-04-04
CN115909858B true CN115909858B (en) 2023-05-09

Family

ID=85739227

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202310213585.4A Active CN115909858B (en) 2023-03-08 2023-03-08 Flight simulation experience system based on VR image

Country Status (1)

Country Link
CN (1) CN115909858B (en)

Family Cites Families (18)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP0100097B1 (en) * 1982-07-30 1991-01-30 Honeywell Inc. Computer controlled imaging system
US6346938B1 (en) * 1999-04-27 2002-02-12 Harris Corporation Computer-resident mechanism for manipulating, navigating through and mensurating displayed image of three-dimensional geometric model
WO2006137829A2 (en) * 2004-08-10 2006-12-28 Sarnoff Corporation Method and system for performing adaptive image acquisition
US8229163B2 (en) * 2007-08-22 2012-07-24 American Gnc Corporation 4D GIS based virtual reality for moving target prediction
CN102074049A (en) * 2011-03-01 2011-05-25 哈尔滨工程大学 Wide-range terrain scheduling simplifying method based on movement of viewpoint
CN202221566U (en) * 2011-07-08 2012-05-16 中国民航科学技术研究院 Flight programming system and verification platform of performance-based navigation
CN104766366B (en) * 2015-03-31 2019-02-19 东北林业大学 A kind of method for building up of three-dimension virtual reality demonstration
CN105139451B (en) * 2015-08-10 2018-06-26 中国商用飞机有限责任公司北京民用飞机技术研究中心 A kind of Synthetic vision based on HUD guides display system
CN106446351A (en) * 2016-08-31 2017-02-22 郑州捷安高科股份有限公司 Real-time drawing-oriented large-scale scene organization and scheduling technology and simulation system
CN106530896A (en) * 2016-11-30 2017-03-22 中国直升机设计研究所 Virtual system for unmanned aerial vehicle flight demonstration
CN109064546A (en) * 2018-06-08 2018-12-21 东南大学 A kind of landform image data fast dispatch method and its system
CN110908510B (en) * 2019-11-08 2022-09-02 四川大学 Application method of oblique photography modeling data in immersive display equipment
WO2021113268A1 (en) * 2019-12-01 2021-06-10 Iven Connary Systems and methods for generating of 3d information on a user display from processing of sensor data
US11935288B2 (en) * 2019-12-01 2024-03-19 Pointivo Inc. Systems and methods for generating of 3D information on a user display from processing of sensor data for objects, components or features of interest in a scene and user navigation thereon
US11216663B1 (en) * 2020-12-01 2022-01-04 Pointivo, Inc. Systems and methods for generating of 3D information on a user display from processing of sensor data for objects, components or features of interest in a scene and user navigation thereon
CN112001993A (en) * 2020-07-14 2020-11-27 深圳市规划国土房产信息中心(深圳市空间地理信息中心) Multi-GPU (graphics processing Unit) city simulation system for large scene
CN113506370B (en) * 2021-07-28 2022-08-16 自然资源部国土卫星遥感应用中心 Three-dimensional geographic scene model construction method and device based on three-dimensional remote sensing image
CN113516769B (en) * 2021-07-28 2023-04-21 自然资源部国土卫星遥感应用中心 Virtual reality three-dimensional scene loading and rendering method and device and terminal equipment

Also Published As

Publication number Publication date
CN115909858A (en) 2023-04-04

Similar Documents

Publication Publication Date Title
US11538229B2 (en) Image processing method and apparatus, electronic device, and computer-readable storage medium
CN102741879B (en) Method for generating depth maps from monocular images and systems using the same
CN102306395B (en) Distributed drawing method and device of three-dimensional data
US4855934A (en) System for texturing computer graphics images
CN108267154B (en) Map display method and device
CN107193372A (en) From multiple optional position rectangle planes to the projecting method of variable projection centre
CN104731894A (en) Thermodynamic diagram display method and device
CN110827391B (en) Image rendering method, device and equipment and storage medium
CN102447925A (en) Method and device for synthesizing virtual viewpoint image
US9208752B2 (en) Method for synchronous representation of a virtual reality in a distributed simulation system
CN110349261B (en) Method for generating three-dimensional thermodynamic diagram based on GIS
WO2022011915A1 (en) Naked-eye 3d display method and apparatus based on multiple layers of transparent liquid crystal screens
CN112055213B (en) Method, system and medium for generating compressed image
CN110555085A (en) Three-dimensional model loading method and device
CN107274344B (en) Map zooming method and system based on resource distribution, memory and control equipment
CN115909858B (en) Flight simulation experience system based on VR image
CN116363290A (en) Texture map generation method for large-scale scene three-dimensional reconstruction
CN107688431A (en) Man-machine interaction method based on radar fix
JPWO2015186284A1 (en) Image processing apparatus, image processing method, and program
CN116883576A (en) TBR+PT-based collaborative rendering method and device
CN103106687B (en) The computer generating method of three-dimensional ocean grid and device thereof in self-adaptation FOV (Field of View)
Yin et al. Application of virtual reality in marine search and rescue simulator.
CN109741465B (en) Image processing method and device and display device
CN112507766B (en) Face image extraction method, storage medium and terminal equipment
Mueller The sort-first architecture for real-time image generation

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant
GR01 Patent grant