CN115909858B - Flight simulation experience system based on VR image - Google Patents
- Publication number: CN115909858B (application CN202310213585.4A)
- Authority: CN (China)
- Prior art keywords: user, terrain, resolution, characteristic value, representing
- Prior art date
- Legal status: Active (an assumption, not a legal conclusion; Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed)
Abstract
The invention relates to the technical field of data processing, and in particular to a flight simulation experience system based on VR images. The system partitions the terrain into sub-areas; obtains each user's visual range area and projection plane from the user's viewpoint and main viewing direction; obtains a resolution characteristic value and a texture detail characteristic value for each terrain sub-area within the visual range area; obtains each user's processing terrain sub-areas and terrain elevation data matrix; calculates the user's relative scheduling resources and relative computing resources from that matrix; allocates computer hardware resources according to each user's scheduling resource ratio and computing resource ratio; and schedules the user's generation resources to generate the user's VR display picture. Computer hardware resources are thereby allocated more reasonably in a multi-user scene, and the picture quality of every user's VR display device in that scene is guaranteed.
Description
Technical Field
The invention relates to the technical field of data processing, in particular to a flight simulation experience system based on VR images.
Background
A flight simulation experience system is simulation equipment for flight training and science popularization; its main application is pilot training. Being safe, reliable, economical, and unaffected by weather conditions, it has been widely adopted amid the recent growth of civil aviation. Combining VR technology with the flight simulation experience system further improves the immersion and interactivity of flight simulation, making the simulated flight scene more realistic, but the continuing development of VR technology also places higher demands on computer graphics performance.
The flight simulation experience system must rapidly process huge volumes of terrain data and texture map data to keep the simulated picture continuous and avoid tearing. As users' picture-quality requirements rise, computer hardware costs grow accordingly, and coordinating computer hardware resources in a multi-user scene so as to guarantee picture quality at each user's VR display end has become an urgent industry problem. In the prior art, to save hardware cost, the terrain data and texture data required for the flight simulation picture are generally adjusted in real time as the flight viewpoint changes. In a multi-user scene, however, different users have different main viewing directions. The prior art uses a terrain LOD (level-of-detail) model algorithm to schedule terrain data storage at different levels and thereby save computer hardware resources, but this approach cannot schedule terrain data comprehensively across multiple user targets and cannot guarantee the picture quality of every user's VR display device. A method that schedules terrain data comprehensively across multiple main viewing directions, and thereby guarantees the picture quality of multi-user VR display devices, is therefore needed.
Disclosure of Invention
The invention provides a flight simulation experience system based on VR images, which aims to solve the existing problems.
The invention relates to a flight simulation experience system based on VR images, which adopts the following technical scheme:
one embodiment of the present invention provides a VR image-based flight simulation experience system, comprising:
the terrain area blocking module is used for blocking the terrain area to obtain all terrain sub-areas;
the regional resource dividing module divides the texture data packet to obtain texture data sub-packets of each terrain sub-region, sets different resolution levels and obtains terrain elevation data blocks of all resolution levels of each terrain sub-region;
the regional characteristic acquisition module acquires a visual range region and a projection plane of a user according to the viewpoint and the main viewing direction of the user, and acquires a resolution characteristic value and a texture detail characteristic value of each terrain subarea in the visual range region;
the regional characteristic grading module is used for grading the resolution characteristic values and the texture detail characteristic values to obtain all resolution characteristic value levels and texture detail characteristic value levels;
the user resource characteristic acquisition module acquires all the terrain sub-areas and the terrain elevation data matrixes of the user, and calculates the relative scheduling resources and the relative calculation resources of the user according to the terrain elevation data matrixes;
the user resource acquisition module takes the target level terrain elevation data blocks and the target texture data sub-packets corresponding to all the processing terrain sub-areas of the user as the generation resources of the user;
and the hardware resource allocation module is used for calculating each user's scheduling resource ratio and computing resource ratio, allocating computer hardware resources according to those ratios, and scheduling each user's generation resources to generate each user's VR display picture.
Further, the setting of different resolution levels includes the following specific steps:
dividing from the highest resolution level down to the lowest into a second preset number B of resolution levels, numbered 1 to B, where resolution level 1 is the lowest and resolution level B the highest; let M_min denote the number of data points in the terrain elevation data block of the lowest resolution level and M_max the number of data points in the block of the highest resolution level; the interval [M_min, M_max] is divided into B equal sub-intervals: a terrain elevation data block whose number of data points falls within the 1st sub-interval belongs to resolution level 1, one whose count falls within the 2nd sub-interval belongs to resolution level 2, and in general one whose count falls within the j-th sub-interval, [M_min + (j-1)(M_max - M_min)/B, M_min + j(M_max - M_min)/B], belongs to resolution level j.
Further, the method for acquiring the visual range area and the projection plane of the user according to the viewpoint and the main viewing direction of the user comprises the following specific steps:
when a plurality of users use the flight simulation equipment at the same time, the viewpoints of the users are taken as the same viewpoint, while the users have multiple main viewing directions;
acquiring a visual range area on a terrain area according to the viewpoint by combining the view angle limitation of the flight simulation equipment; and combining the VR equipment information, and acquiring a projection plane of each user according to the main viewing direction of each user.
Further, the method for obtaining the resolution characteristic value and the texture detail characteristic value comprises the following steps:
acquiring the distance between the center point of each terrain subarea and the viewpoint in the visual range area, and recording the distance as the viewpoint distance of each terrain subarea; acquiring the distance between the central point of each terrain subarea and the central point of the projection plane of each user, and recording the distance as the main viewing distance between each terrain subarea and each user;
and dividing the viewpoint distance of the terrain sub-area by a first value to obtain the resolution characteristic value of the terrain sub-area, and dividing the main viewing distance between the terrain sub-area and a user by a second value to obtain the texture detail characteristic value of the terrain sub-area for that user, wherein the first value is the visual distance set by the flight simulation system and the second value is the maximum distance from a point on the user's projection plane to the plane's center point.
Further, the steps of obtaining all the resolution characteristic value levels and the texture detail characteristic value levels comprise the following specific steps:
according to the value range [0,1] of the resolution characteristic value, all resolution characteristic value levels are obtained, comprising: a resolution characteristic value within ((B-1)/B, 1] belongs to resolution characteristic value level 1, a value within ((B-2)/B, (B-1)/B] belongs to level 2, and similarly a value within ((B-j)/B, (B-j+1)/B] belongs to resolution characteristic value level j, where B represents the second preset number;
according to the value range [0,1] of the texture detail characteristic value, all texture detail characteristic value levels are obtained, comprising: a texture detail characteristic value within ((C-1)/C, 1] belongs to texture detail characteristic value level 1, a value within ((C-2)/C, (C-1)/C] belongs to level 2, and similarly a value within ((C-j)/C, (C-j+1)/C] belongs to texture detail characteristic value level j, where C represents the third preset number.
Further, the step of acquiring all the processing terrain sub-areas and the terrain elevation data matrixes of the user comprises the following specific steps:
for any terrain subarea, acquiring the lowest level of the texture detail characteristic value levels of the terrain subarea and a plurality of users, and marking the user corresponding to the lowest level of the texture detail characteristic value level as a processing user of the terrain subarea; acquiring processing users corresponding to each topographic subarea in the visual range area, and distributing each topographic subarea in the visual range area to the corresponding processing user as the processing topographic subarea of the processing user;
for any one user, counting all processing terrain sub-areas of the user, and taking the number of processing terrain sub-areas belonging to resolution characteristic value level a and texture detail characteristic value level b as the element in row a, column b of the user's terrain elevation data matrix.
Further, the calculation method of the relative scheduling resource comprises the following steps:
D_i = -Σ_(a=1)^(B) Σ_(b=1)^(C) (a/B) · (N^i_(a,b) / n_i) · ln(N^i_(a,b) / n_i)

where D_i represents the relative scheduling resource of the i-th user, n_i represents the number of processing terrain sub-areas belonging to the i-th user, N^i_(a,b) represents the element in row a, column b of the i-th user's terrain elevation data matrix, ln represents the natural logarithm with base e, B represents the second preset number, and C represents the third preset number; terms with N^i_(a,b) = 0 are taken as 0.
Further, the computing method of the relative computing resource comprises the following steps:
E_i = -Σ_(a=1)^(B) Σ_(b=1)^(C) (b/C) · (N^i_(a,b) / n_i) · ln(N^i_(a,b) / n_i)

where E_i represents the relative computing resource of the i-th user, n_i represents the number of processing terrain sub-areas belonging to the i-th user, N^i_(a,b) represents the element in row a, column b of the i-th user's terrain elevation data matrix, ln represents the natural logarithm with base e, B represents the second preset number, and C represents the third preset number; terms with N^i_(a,b) = 0 are taken as 0.
Further, the method for acquiring the target level terrain elevation data block and the target texture data sub-packet comprises the following steps:
according to the resolution characteristic value level a of the user's processing terrain sub-area, the terrain elevation data block of resolution level a corresponding to the processing terrain sub-area is obtained and recorded as the target level terrain elevation data block of the processing terrain sub-area; according to the texture detail characteristic value level b of the user's processing terrain sub-area, the usage rate of the texture data sub-packet corresponding to the processing terrain sub-area is set to b/C, and the result is recorded as the target texture data sub-packet of the processing terrain sub-area, where C represents the third preset number.
Further, calculating the scheduling resource ratio and the computing resource ratio of the user comprises the following specific steps:
P_i = D_i / Σ_(k=1)^(U) D_k,   Q_i = E_i / Σ_(k=1)^(U) E_k

where P_i represents the scheduling resource ratio of the i-th user, D_i represents the relative scheduling resource of the i-th user, U represents the number of users simultaneously using the flight simulation device, and D_k represents the relative scheduling resource of the k-th user; Q_i represents the computing resource ratio of the i-th user, E_i represents the relative computing resource of the i-th user, and E_k represents the relative computing resource of the k-th user.
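The two ratios are plain normalizations of the relative resources across all simultaneous users; a minimal sketch (function name illustrative, not from the patent):

```python
def resource_ratios(relative_values):
    """Normalize each user's relative resource (scheduling or computing)
    into a share of the total across all simultaneous users:
    ratio_i = r_i / sum_k r_k."""
    total = sum(relative_values)
    return [r / total for r in relative_values]

# e.g. relative scheduling resources of two users:
# resource_ratios([1.0, 3.0]) -> [0.25, 0.75]
```

The same function serves both ratios, since each is the user's share of the corresponding total.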
The technical scheme of the invention has the following beneficial effects: compared with the existing single-user flight simulation experience system, the multi-user flight simulation experience system provided by the invention acquires each user's visual range area and projection plane from the user's viewpoint and main viewing direction; acquires the resolution characteristic value and texture detail characteristic value of each terrain sub-area within the visual range area; calculates each user's relative scheduling resources and relative computing resources from the user's terrain elevation data matrix; allocates computer hardware resources according to each user's scheduling resource ratio and computing resource ratio; and schedules the user's generation resources to generate the user's VR display picture. Computer hardware resources are thereby allocated more reasonably in a multi-user scene, and the picture quality of each user's VR display device is guaranteed.
Drawings
In order to more clearly illustrate the embodiments of the invention or the technical solutions in the prior art, the drawings that are required in the embodiments or the description of the prior art will be briefly described, it being obvious that the drawings in the following description are only some embodiments of the invention, and that other drawings may be obtained according to these drawings without inventive effort for a person skilled in the art.
FIG. 1 is a flow chart of steps of a VR image-based flight simulation experience system of the present invention;
fig. 2 is a schematic view of the visual range area and projection plane of a user of the present invention.
Detailed Description
In order to further describe the technical means and effects adopted by the present invention to achieve its intended purpose, the specific implementation, structure, features, and effects of the VR image-based flight simulation experience system according to the present invention are described in detail below with reference to the accompanying drawings and preferred embodiments. In the following description, different instances of "one embodiment" or "another embodiment" do not necessarily refer to the same embodiment. Furthermore, the particular features, structures, or characteristics of one or more embodiments may be combined in any suitable manner.
Unless defined otherwise, all technical and scientific terms used herein have the same meaning as commonly understood by one of ordinary skill in the art to which this invention belongs.
The following specifically describes a specific scheme of a flight simulation experience system based on VR images provided by the invention with reference to the accompanying drawings.
Referring to fig. 1, a flowchart illustrating steps of a VR image-based flight simulation experience system according to an embodiment of the present invention is shown, where the system includes: the system comprises a terrain area partitioning module, an area resource dividing module, an area characteristic obtaining module, an area characteristic grading module, a user resource characteristic obtaining module, a user resource obtaining module and a hardware resource distributing module.
The terrain area blocking module is used for blocking the terrain area to obtain all terrain sub-areas.
The method specifically comprises the following steps:
partitioning the terrain area of the flight simulation scene to obtain all terrain sub-areas, which comprises: the whole terrain area of the flight simulation scene is a square region; it is uniformly divided into a first preset number of terrain sub-areas (square regions) of equal area, the area of each sub-area being denoted S, in square decimeters. In this embodiment the first preset number is 10000; in other embodiments, the practitioner may set the first preset number as desired.
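As a sketch of the partitioning step, assuming the first preset number is a perfect square so that the square region splits into a uniform grid (function and parameter names are illustrative, not from the patent):

```python
import math

def partition_terrain(side_length_dm, first_preset_number):
    """Uniformly partition the square terrain region into a grid of
    equal square sub-areas; returns the grid dimension, the side length
    of one sub-area, and its area S (in the input length unit squared)."""
    grid = math.isqrt(first_preset_number)
    if grid * grid != first_preset_number:
        raise ValueError("first preset number must be a perfect square")
    sub_side = side_length_dm / grid   # side of one square sub-area
    return grid, sub_side, sub_side * sub_side
```

With the embodiment's first preset number of 10000, this yields a 100 × 100 grid of sub-areas.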
The regional resource division module is used for dividing the DEM terrain elevation data and the texture data packet according to all the terrain subareas.
The method specifically comprises the following steps:
setting different resolution levels, including: the DEM terrain elevation data of the terrain area are used for drawing the terrain of the terrain area, the DEM terrain elevation data corresponding to each terrain subarea are recorded as terrain elevation data blocks, the number of data points in the terrain elevation data blocks of different resolution levels of the terrain subarea is different, and the number of data points in the terrain elevation data blocks of higher resolution levels is larger. The highest resolution level of the terrain elevation data block is determined by the source of the DEM terrain elevation data, while the lowest resolution level of the terrain elevation data block is set by the user himself.
In the embodiment of the invention, the number of data points in the terrain elevation data block of the highest resolution level is S/s1, where S represents the area of the terrain sub-area and s1 represents a first unit area, 0.25 square meters; the number of data points in the terrain elevation data block of the lowest resolution level is S/s2, where s2 represents a second unit area, 100 square meters in this embodiment. In other embodiments, the practitioner may set the second unit area as desired.
Dividing from the highest resolution level down to the lowest into a second preset number B of resolution levels, numbered 1 to B, where resolution level 1 is the lowest and resolution level B the highest. Let M_min denote the number of data points in the terrain elevation data block of the lowest resolution level and M_max the number of data points in the block of the highest resolution level. The interval [M_min, M_max] is divided into B equal sub-intervals: a terrain elevation data block whose number of data points falls within the 1st sub-interval belongs to resolution level 1, one whose count falls within the 2nd sub-interval belongs to resolution level 2, and in general one whose count falls within the j-th sub-interval, [M_min + (j-1)(M_max - M_min)/B, M_min + j(M_max - M_min)/B], belongs to resolution level j.
In this embodiment the second preset number is 50; in other embodiments, the practitioner may set the second preset number as desired.
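Assuming the B resolution levels partition the point-count interval [M_min, M_max] into equal sub-intervals, which is the natural reading of the garbled ranges above, the level of a block can be recovered from its data-point count:

```python
def resolution_level(num_points, m_min, m_max, b):
    """Bin a terrain elevation data block's point count into one of b
    resolution levels (level 1 = lowest resolution, level b = highest),
    using b equal sub-intervals of [m_min, m_max]; a count exactly on an
    interior boundary counts toward the higher level here."""
    if num_points <= m_min:
        return 1
    if num_points >= m_max:
        return b
    width = (m_max - m_min) / b
    return min(b, int((num_points - m_min) / width) + 1)
```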
Storing terrain elevation data blocks of different resolution levels for the terrain sub-area according to a terrain LOD model algorithm, comprising: for any one terrain subarea, obtaining terrain elevation data blocks of all resolution levels of the terrain subarea, and storing the terrain elevation data blocks of each resolution level of the terrain subarea in corresponding units; the terrain elevation data blocks for all resolution levels of all terrain sub-areas are stored in corresponding cells.
The terrain area of the flight simulation scene corresponds to a texture data packet, which is used, after the terrain of the terrain area has been drawn, to render the terrain area and draw its specific texture details. According to the partitioning of the terrain area, the texture data packet is divided into texture data sub-packets, each terrain sub-area corresponding to one texture data sub-packet used for rendering that sub-area; the higher the usage rate of the texture data sub-packet, the higher the degree of rendering of the terrain sub-area, the richer its texture details, and the better the visual effect.
The regional characteristic acquisition module is used for acquiring the resolution characteristic value and the texture detail characteristic value of the terrain subarea.
The method specifically comprises the following steps:
when a plurality of users use the flight simulation equipment at the same time, the viewpoints and main viewing directions of the users are obtained through a positioning device arranged in the flight simulation equipment; because the viewpoints of different users on the same flight simulation equipment are approximately the same and only their main viewing directions differ, the users are considered to share a single viewpoint while having multiple main viewing directions.
The visual range area on the terrain area is obtained from the viewpoint combined with the view-angle limits of the flight simulation device, as shown in fig. 2; this is prior art and is not described in detail here. Combined with the VR equipment information, the projection plane of each user is obtained from that user's main viewing direction, also as shown in fig. 2.
Acquiring the distance between the center point of each terrain subarea and the viewpoint in the visual range area, and recording the distance as the viewpoint distance of each terrain subarea; and acquiring the distance between the central point of each terrain subarea and the central point of the projection plane of each user, and recording the distance as the main viewing distance between each terrain subarea and each user.
The requirements of the user on the resolution level and texture details of each terrain subarea in the visual range area are related to the viewpoint distance of each terrain subarea and the main viewing distance of each terrain subarea and the user: the larger the viewpoint distance of the terrain subarea is, the farther the terrain subarea is from the viewpoint, the smaller the requirement of the user on the resolution of the terrain subarea is, namely, the larger the viewpoint distance is, the terrain elevation data blocks with lower resolution level of the terrain subarea are required to be used for forming the simulated terrain (under the main view direction) of the user; the smaller the viewpoint distance is, the closer the terrain subarea is to the viewpoint, the larger the requirement of the user on the resolution of the terrain elevation data block of the terrain subarea is, namely, the smaller the viewpoint distance is, the terrain elevation data block with the higher resolution level of the terrain subarea is required to be used for forming the simulated terrain (in the main view direction) of the user.
It should be further noted that, the larger the main viewing distance between the topographic subregion and the user, the farther the topographic subregion is from the projection plane of the user, the smaller the requirement of the user on the texture details of the topographic subregion (determined by the human eye imaging characteristics), the lower the usage rate of the texture data sub-packet, that is, the larger the main viewing distance, the fewer the texture details of the topographic subregion are required, and the lower the usage rate of the texture data sub-packet is; the smaller the main viewing distance is, the closer the topographic subarea is to the projection plane of the user, the larger the requirement of the user on the texture details (determined by the human eye imaging characteristics) of the topographic subarea is, namely the smaller the main viewing distance is, the more the texture details of the topographic subarea are required, and the higher the usage rate of the texture data subpacket is.
In summary, the user's requirements on the resolution level and texture detail of a terrain sub-area are represented by the sub-area's viewpoint distance and by the main viewing distance between the sub-area and the user, respectively. The viewpoint distance of the terrain sub-area divided by a first value gives the sub-area's resolution characteristic value; the first value is the visual distance set by the flight simulation system, determined by its configuration parameters. The main viewing distance between the terrain sub-area and the user divided by a second value gives the texture detail characteristic value of the sub-area for that user; the second value is the maximum distance from a point on the user's projection plane to the plane's center point. Both characteristic values are thus normalized results.
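The two characteristic values can then be computed directly from the geometry; a sketch assuming 3-D points given as (x, y, z) tuples, with illustrative names:

```python
import math

def characteristic_values(sub_center, viewpoint, proj_center,
                          visual_distance, max_plane_distance):
    """Resolution and texture detail characteristic values of one
    terrain sub-area for one user: the viewpoint distance divided by
    the first value (the system's visual distance), and the main
    viewing distance divided by the second value (the maximum distance
    from a point on the user's projection plane to its center)."""
    def dist(p, q):
        return math.sqrt(sum((u - v) ** 2 for u, v in zip(p, q)))

    res_cv = dist(sub_center, viewpoint) / visual_distance
    tex_cv = dist(sub_center, proj_center) / max_plane_distance
    return res_cv, tex_cv
```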
The regional characteristic grading module is used for grading the resolution characteristic value and the texture detail characteristic value of the terrain subarea.
The method specifically comprises the following steps:
The larger the viewpoint distance of a terrain sub-area, the larger its resolution characteristic value and the smaller the user's requirement on its resolution; therefore, the larger the resolution characteristic value of a terrain sub-area, the smaller its resolution characteristic value level. The resolution characteristic values are divided into a second preset number B of resolution characteristic value levels. Since the resolution characteristic value is a normalized result, its value range is [0,1], and all resolution characteristic value levels are specifically: a resolution characteristic value within ((B-1)/B, 1] belongs to resolution characteristic value level 1, a value within ((B-2)/B, (B-1)/B] belongs to level 2, and similarly a value within ((B-j)/B, (B-j+1)/B] belongs to resolution characteristic value level j. Resolution characteristic value level B is the highest level and level 1 the lowest.
The larger the main viewing distance between a terrain sub-area and a user, the larger their texture detail characteristic value and the smaller the user's requirement on the sub-area's texture details; therefore, the larger the texture detail characteristic value, the smaller the texture detail characteristic value level. The texture detail characteristic values are divided into a third preset number C of texture detail characteristic value levels. Since the texture detail characteristic value is a normalized result, its value range is [0,1], and all texture detail characteristic value levels are specifically: a texture detail characteristic value within ((C-1)/C, 1] belongs to texture detail characteristic value level 1, a value within ((C-2)/C, (C-1)/C] belongs to level 2, and similarly a value within ((C-j)/C, (C-j+1)/C] belongs to texture detail characteristic value level j. Texture detail characteristic value level C is the highest level and level 1 the lowest.
In this embodiment the third preset number is 10; in other embodiments, the practitioner may set the third preset number as desired.
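Both gradings follow the same rule: a normalized value near 1 (far away, low demand) lands in level 1 and a value near 0 in the top level. A sketch of that mapping (name illustrative):

```python
import math

def characteristic_level(value, n_levels):
    """Map a normalized characteristic value in [0, 1] to a level in
    1..n_levels: a value in ((n-j)/n, (n-j+1)/n] belongs to level j,
    so values near 1 fall in level 1 and values near 0 in level n."""
    return min(n_levels, n_levels - math.ceil(value * n_levels) + 1)
```

For example, with the embodiment's 10 texture detail levels, a value of 0.95 falls in level 1 and a value of 0.05 in level 10.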
The user resource characteristic acquisition module is used for calculating the relative scheduling resources and relative computing resources of the user.
The method specifically comprises the following steps:
It should be noted that, for a given user, each terrain sub-area in the user's visual range area has a resolution characteristic value level and a texture detail characteristic value level, but not every corresponding terrain elevation data block is scheduled by that user's computing device to form the user's simulated terrain. The final objective, for each user, is accurate scheduling and texture drawing of every terrain sub-area within the user's visual range area. Based on this logic, any terrain sub-area has a texture detail characteristic value, and hence a texture detail characteristic value level, with respect to each user; the terrain elevation data block corresponding to the sub-area should be scheduled by the user whose texture detail characteristic value level for that sub-area is lowest, and used to form that user's simulated terrain.
For any terrain subarea, acquiring the lowest level of the texture detail characteristic value levels of the terrain subarea and a plurality of users, and marking the user corresponding to the lowest level of the texture detail characteristic value level as a processing user of the terrain subarea; each of the topographical sub-areas within the visual range area has a corresponding processing user, and each of the topographical sub-areas within the visual range area is assigned to the processing user as a processing topographical sub-area for the processing user.
For any one user, among all the processing terrain sub-areas of the user, count the number of processing terrain sub-areas belonging to resolution feature value level j and texture detail feature value level l, and take this number as the element in the j-th row and l-th column of the terrain elevation data matrix of the user, where j ∈ [1, m] and l ∈ [1, n]; there are m resolution feature value levels and n texture detail feature value levels in total, so the terrain elevation data matrix of the user is of size m × n, where m represents the second preset number and n represents the third preset number.
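Building this per-user m × n matrix of level counts can be sketched as below; the names and the encoding of each processing terrain sub-area as a `(resolution_level, texture_level)` pair are illustrative assumptions:

```python
def build_terrain_matrix(level_pairs, m, n):
    """Count a user's processing terrain sub-areas per level pair.

    level_pairs: iterable of (resolution_level, texture_level) tuples,
    with levels numbered 1..m and 1..n respectively.
    Returns the m x n terrain elevation data matrix of counts.
    """
    matrix = [[0] * n for _ in range(m)]
    for j, l in level_pairs:
        matrix[j - 1][l - 1] += 1  # row j, column l (1-indexed in the text)
    return matrix
```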
The calculation formula of the relative scheduling resource of the user is specifically as follows:
D(i) = -Σ_{j=1}^{m} (S(i,j)/N(i)) · ln(S(i,j)/N(i)), where S(i,j) = Σ_{l=1}^{n} N(i,j,l)

where D(i) represents the relative scheduling resource of the i-th user; N(i) represents the number of processing terrain sub-areas belonging to the i-th user; N(i,j,l) represents the element in the j-th row and l-th column of the terrain elevation data matrix of the i-th user, that is, the number of processing terrain sub-areas belonging to resolution feature value level j and texture detail feature value level l; S(i,j) is the number of processing terrain sub-areas of the i-th user belonging to resolution feature value level j; ln represents the natural logarithm with base e; m represents the second preset number; and n represents the third preset number. D(i) is the entropy of the numbers of processing terrain sub-areas over the different resolution feature value levels: the larger D(i) is, the more dispersed the resolution feature value levels of the processing terrain sub-areas of the i-th user are, and the more resources need to be scheduled when forming the simulated terrain of the i-th user.
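Since the original formula is rendered as an image, the following is a hedged reconstruction of one consistent reading: the entropy of the user's processing terrain sub-area counts over resolution feature value levels. All names are illustrative.

```python
import math

def relative_scheduling_resource(matrix):
    """Entropy of a user's processing terrain sub-areas over resolution levels.

    matrix: the user's m x n terrain elevation data matrix of counts;
    row j holds the counts for resolution feature value level j.
    """
    total = sum(sum(row) for row in matrix)
    entropy = 0.0
    for row in matrix:
        count = sum(row)  # sub-areas at this resolution level, any texture level
        if count:
            p = count / total
            entropy -= p * math.log(p)
    return entropy
```

A user whose sub-areas are spread evenly over many resolution levels gets a larger value than one whose sub-areas all share a single level, matching the interpretation in the text.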
The calculation formula of the relative calculation resource of the user is specifically:
J(i) = -Σ_{l=1}^{n} (l/n) · (T(i,l)/N(i)) · ln(T(i,l)/N(i)), where T(i,l) = Σ_{j=1}^{m} N(i,j,l)

where J(i) represents the relative computing resource of the i-th user; N(i) represents the number of processing terrain sub-areas belonging to the i-th user; N(i,j,l) represents the element in the j-th row and l-th column of the terrain elevation data matrix of the i-th user; T(i,l) is the number of processing terrain sub-areas of the i-th user belonging to texture detail feature value level l; ln represents the natural logarithm with base e; m represents the second preset number; and n represents the third preset number. The higher the texture detail feature value levels of the processing terrain sub-areas are, the larger the weight l/n is; the user then requires more texture detail for these terrain sub-areas, the utilization rate of the texture data sub-packets is higher, more computing power is needed to process the corresponding terrain elevation data blocks, and the relative computing resource J(i) of the user is larger.
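A sketch of the texture-level-weighted reading of the relative computing resource described above; because the original formula is an image, the l/n weighting of each texture level's entropy term is an assumption consistent with the surrounding text, and all names are illustrative:

```python
import math

def relative_computing_resource(matrix):
    """Texture-level-weighted entropy of a user's processing sub-area counts.

    matrix: the user's m x n terrain elevation data matrix of counts;
    column l holds the counts for texture detail feature value level l.
    Higher texture levels contribute with a larger weight l/n.
    """
    total = sum(sum(row) for row in matrix)
    n = len(matrix[0])
    value = 0.0
    for l in range(1, n + 1):
        count = sum(row[l - 1] for row in matrix)  # sub-areas at texture level l
        if count:
            q = count / total
            value -= (l / n) * q * math.log(q)
    return value
```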
The user resource acquisition module is used for acquiring the generation resources of the user.
The method specifically comprises the following steps:
For any user, the target level terrain elevation data blocks and the target texture data sub-packets corresponding to all the processing terrain sub-areas belonging to the user constitute the generation resources that the user needs to schedule. The target level terrain elevation data block and the target texture data sub-packet of a processing terrain sub-area are acquired as follows: according to the resolution feature value level j of the processing terrain sub-area of the user, obtain the terrain elevation data block of the corresponding resolution level j of the processing terrain sub-area and record it as the target level terrain elevation data block of the processing terrain sub-area; according to the texture detail feature value level l of the processing terrain sub-area of the user, set the utilization rate of the texture data sub-packet corresponding to the processing terrain sub-area to l/n and record the result as the target texture data sub-packet of the processing terrain sub-area, where n represents the third preset number.
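The mapping from a processing terrain sub-area's two levels to its generation resources (the elevation-block resolution level to schedule and the texture sub-packet utilization rate l/n) can be sketched as below; names and the dictionary encoding are illustrative assumptions:

```python
def generation_resources(processing_areas, n):
    """Derive each processing terrain sub-area's generation resources.

    processing_areas: {area_id: (resolution_level, texture_level)};
    n: the third preset number (total count of texture detail levels).
    """
    return {
        area: {
            "elevation_block_level": j,   # schedule the level-j elevation block
            "texture_utilisation": l / n, # texture data sub-packet usage rate
        }
        for area, (j, l) in processing_areas.items()
    }
```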
The hardware resource allocation module is used for calculating the dispatching resource duty ratio and the calculation resource duty ratio of the user and allocating the computer hardware resources.
The method specifically comprises the following steps:
according to the relative scheduling resource and the relative computing resource of the user, the scheduling resource duty ratio and the computing resource duty ratio of the user are calculated, specifically:
P(i) = D(i) / Σ_{k=1}^{U} D(k),  Q(i) = J(i) / Σ_{k=1}^{U} J(k)

where P(i) represents the scheduling resource duty ratio of the i-th user; D(i) represents the relative scheduling resource of the i-th user; U represents the number of users simultaneously using the flight simulation equipment; D(k) represents the relative scheduling resource of the k-th user; Q(i) represents the computing resource duty ratio of the i-th user; J(i) represents the relative computing resource of the i-th user; and J(k) represents the relative computing resource of the k-th user.
The larger a user's relative scheduling resource and relative computing resource are, the larger the user's scheduling resource duty ratio and computing resource duty ratio are; such a user needs more scheduling and computing resources, requires correspondingly greater computing power, and is therefore allocated more computer hardware resources.
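The proportional allocation described above amounts to normalizing each user's relative resource into a share of the total and splitting the hardware pool by those shares. A minimal sketch, with illustrative names and fractional (non-integer) allocation assumed:

```python
def resource_shares(values):
    """Normalize each user's relative resource into a duty ratio."""
    total = sum(values)
    return [v / total for v in values]

def allocate(pool, shares):
    """Split a pool of hardware resource units proportionally to the shares."""
    return [pool * s for s in shares]
```

For example, two users with relative resources 1 and 3 receive duty ratios 0.25 and 0.75, and a pool of 8 units is split 2 : 6.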
Corresponding computer hardware resources are allocated to each user according to the scheduling resource duty ratios and computing resource duty ratios of all users, and the generation resources of each user are scheduled to generate the VR display picture of that user.
The system comprises a terrain area blocking module, a regional resource dividing module, a regional characteristic acquisition module, a regional characteristic grading module, a user resource characteristic acquisition module, a user resource acquisition module and a hardware resource allocation module. Compared with an existing single-user flight simulation experience system, the multi-user flight simulation experience system provided by the invention acquires the visual range area and the projection plane of each user according to the viewpoint and the main viewing direction of the user; acquires the resolution feature value and the texture detail feature value of each terrain sub-area in the visual range area; calculates the relative scheduling resource and the relative computing resource of the user according to the terrain elevation data matrix; distributes computer hardware resources according to the scheduling resource duty ratio and the computing resource duty ratio of the user; and schedules the generation resources of the user to generate the VR display picture of the user, thereby allocating computer hardware resources more reasonably in a multi-user scene and guaranteeing the picture quality of every user's VR display device in that scene.
The foregoing description of the preferred embodiments of the invention is not intended to be limiting, but rather is intended to cover all modifications, equivalents, alternatives, and improvements that fall within the spirit and scope of the invention.
Claims (1)
1. A VR image based flight simulation experience system, the system comprising:
the terrain area blocking module is used for blocking the terrain area to obtain all terrain sub-areas;
the regional resource dividing module divides the texture data packet to obtain texture data sub-packets of each terrain sub-region, sets different resolution levels and obtains terrain elevation data blocks of all resolution levels of each terrain sub-region;
the regional characteristic acquisition module acquires a visual range region and a projection plane of a user according to the viewpoint and the main viewing direction of the user, and acquires a resolution characteristic value and a texture detail characteristic value of each terrain subarea in the visual range region;
the regional characteristic grading module is used for grading the resolution characteristic values and the texture detail characteristic values to obtain all resolution characteristic value levels and texture detail characteristic value levels;
the user resource characteristic acquisition module acquires all the processing terrain sub-areas and the terrain elevation data matrix of the user, and calculates the relative scheduling resource and the relative computing resource of the user according to the terrain elevation data matrix;
the user resource acquisition module takes the target level terrain elevation data blocks and the target texture data sub-packets corresponding to all the processing terrain sub-areas of the user as the generation resources of the user;
the hardware resource allocation module is used for calculating the dispatching resource proportion and the computing resource proportion of the users, allocating computer hardware resources according to the dispatching resource proportion and the computing resource proportion of the users, and generating resources for dispatching the users to generate VR display pictures of the users;
the setting of different resolution levels comprises the following specific steps:
dividing from the lowest resolution level to the highest resolution level to obtain a second preset number of resolution levels, wherein the resolution levels range over [1, m], and resolution level 1 is the lowest resolution level; the number of data points in the terrain elevation data block of resolution level 1 lies within the first of m equal sub-intervals of [s, S], the number of data points in the terrain elevation data block of resolution level 2 lies within the second sub-interval, and, similarly, the number of data points in the terrain elevation data block of resolution level m lies within the m-th sub-interval, wherein m represents the second preset number, s represents the number of data points in the terrain elevation data block of the lowest resolution level, and S represents the number of data points in the terrain elevation data block of the highest resolution level;
the method for acquiring the visual range area and the projection plane of the user according to the viewpoint and the main viewing direction of the user comprises the following specific steps:
when a plurality of users use the flight simulation equipment at the same time, the viewpoints of the plurality of users are the same viewpoint, and the main viewing directions of the plurality of users are a plurality of;
acquiring a visual range area on a terrain area according to the viewpoint by combining the view angle limitation of the flight simulation equipment; combining VR equipment information, and acquiring a projection plane of each user according to the main viewing direction of each user;
the method for acquiring the resolution characteristic value and the texture detail characteristic value comprises the following steps:
acquiring the distance between the center point of each terrain subarea and the viewpoint in the visual range area, and recording the distance as the viewpoint distance of each terrain subarea; acquiring the distance between the central point of each terrain subarea and the central point of the projection plane of each user, and recording the distance as the main viewing distance between each terrain subarea and each user;
taking the result of dividing the viewpoint distance of a terrain sub-area by a first numerical value as the resolution feature value of the terrain sub-area, and taking the result of dividing the main viewing distance between a terrain sub-area and a user by a second numerical value as the texture detail feature value of the terrain sub-area with respect to that user, wherein the second numerical value is the maximum distance from any point on the projection plane of the user to the center point of the projection plane;
the method for obtaining all the resolution characteristic value levels and the texture detail characteristic value levels comprises the following specific steps:
all resolution feature value levels are obtained according to the value range [0,1] of the resolution feature value, including: when the resolution feature value is within [0, 1/m], it belongs to resolution feature value level 1; when the resolution feature value is within (1/m, 2/m], it belongs to resolution feature value level 2; and, similarly, when the resolution feature value is within ((m-1)/m, 1], it belongs to resolution feature value level m, wherein m represents the second preset number;
all texture detail feature value levels are obtained according to the value range [0,1] of the texture detail feature value, including: when the texture detail feature value is within [0, 1/n], it belongs to texture detail feature value level 1; when the texture detail feature value is within (1/n, 2/n], it belongs to texture detail feature value level 2; and, similarly, when the texture detail feature value is within ((n-1)/n, 1], it belongs to texture detail feature value level n, wherein n represents the third preset number;
the method for acquiring all the processing terrain subareas and the terrain elevation data matrixes of the user comprises the following specific steps:
for any terrain subarea, acquiring the lowest level of the texture detail characteristic value levels of the terrain subarea and a plurality of users, and marking the user corresponding to the lowest level of the texture detail characteristic value level as a processing user of the terrain subarea; acquiring processing users corresponding to each topographic subarea in the visual range area, and distributing each topographic subarea in the visual range area to the corresponding processing user as the processing topographic subarea of the processing user;
for any one user, counting, among all the processing terrain sub-areas of the user, the number of terrain sub-areas belonging to resolution feature value level j and texture detail feature value level l as the element in the j-th row and l-th column of the terrain elevation data matrix of the user;
the calculation method of the relative scheduling resource comprises: D(i) = -Σ_{j=1}^{m} (S(i,j)/N(i)) · ln(S(i,j)/N(i)), where S(i,j) = Σ_{l=1}^{n} N(i,j,l); D(i) represents the relative scheduling resource of the i-th user, N(i) represents the number of processing terrain sub-areas belonging to the i-th user, N(i,j,l) represents the element in the j-th row and l-th column of the terrain elevation data matrix of the i-th user, ln represents the natural logarithm with base e, m represents the second preset number, and n represents the third preset number;
the calculation method of the relative computing resource comprises: J(i) = -Σ_{l=1}^{n} (l/n) · (T(i,l)/N(i)) · ln(T(i,l)/N(i)), where T(i,l) = Σ_{j=1}^{m} N(i,j,l); J(i) represents the relative computing resource of the i-th user, N(i) represents the number of processing terrain sub-areas belonging to the i-th user, N(i,j,l) represents the element in the j-th row and l-th column of the terrain elevation data matrix of the i-th user, ln represents the natural logarithm with base e, m represents the second preset number, and n represents the third preset number;
the method for acquiring the target level terrain elevation data block and the target texture data sub-packet comprises the following steps:
according to the resolution feature value level j of the processing terrain sub-area of the user, obtaining the terrain elevation data block of the corresponding resolution level j of the processing terrain sub-area and recording it as the target level terrain elevation data block of the processing terrain sub-area; according to the texture detail feature value level l of the processing terrain sub-area of the user, setting the utilization rate of the texture data sub-packet corresponding to the processing terrain sub-area to l/n and recording the result as the target texture data sub-packet of the processing terrain sub-area, wherein n represents the third preset number;
the method comprises the following specific steps of:
P(i) = D(i) / Σ_{k=1}^{U} D(k), Q(i) = J(i) / Σ_{k=1}^{U} J(k), wherein P(i) represents the scheduling resource duty ratio of the i-th user, D(i) represents the relative scheduling resource of the i-th user, U represents the number of users simultaneously using the flight simulation equipment, D(k) represents the relative scheduling resource of the k-th user, Q(i) represents the computing resource duty ratio of the i-th user, J(i) represents the relative computing resource of the i-th user, and J(k) represents the relative computing resource of the k-th user.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202310213585.4A CN115909858B (en) | 2023-03-08 | 2023-03-08 | Flight simulation experience system based on VR image |
Publications (2)
Publication Number | Publication Date |
---|---|
CN115909858A CN115909858A (en) | 2023-04-04 |
CN115909858B true CN115909858B (en) | 2023-05-09 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US11538229B2 (en) | Image processing method and apparatus, electronic device, and computer-readable storage medium | |
CN102741879B (en) | Method for generating depth maps from monocular images and systems using the same | |
CN102306395B (en) | Distributed drawing method and device of three-dimensional data | |
US4855934A (en) | System for texturing computer graphics images | |
CN108267154B (en) | Map display method and device | |
CN107193372A (en) | From multiple optional position rectangle planes to the projecting method of variable projection centre | |
CN104731894A (en) | Thermodynamic diagram display method and device | |
CN110827391B (en) | Image rendering method, device and equipment and storage medium | |
CN102447925A (en) | Method and device for synthesizing virtual viewpoint image | |
US9208752B2 (en) | Method for synchronous representation of a virtual reality in a distributed simulation system | |
CN110349261B (en) | Method for generating three-dimensional thermodynamic diagram based on GIS | |
WO2022011915A1 (en) | Naked-eye 3d display method and apparatus based on multiple layers of transparent liquid crystal screens | |
CN112055213B (en) | Method, system and medium for generating compressed image | |
CN110555085A (en) | Three-dimensional model loading method and device | |
CN107274344B (en) | Map zooming method and system based on resource distribution, memory and control equipment | |
CN115909858B (en) | Flight simulation experience system based on VR image | |
CN116363290A (en) | Texture map generation method for large-scale scene three-dimensional reconstruction | |
CN107688431A (en) | Man-machine interaction method based on radar fix | |
JPWO2015186284A1 (en) | Image processing apparatus, image processing method, and program | |
CN116883576A (en) | TBR+PT-based collaborative rendering method and device | |
CN103106687B (en) | The computer generating method of three-dimensional ocean grid and device thereof in self-adaptation FOV (Field of View) | |
Yin et al. | Application of virtual reality in marine search and rescue simulator. | |
CN109741465B (en) | Image processing method and device and display device | |
CN112507766B (en) | Face image extraction method, storage medium and terminal equipment | |
Mueller | The sort-first architecture for real-time image generation |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||