CN113516744A - Virtual scene generation method, interface interaction method, commodity display method and equipment - Google Patents

Virtual scene generation method, interface interaction method, commodity display method and equipment

Info

Publication number
CN113516744A
Authority
CN
China
Prior art keywords
simulation
scene
data information
virtual scene
strategy
Prior art date
Legal status
Granted
Application number
CN202010486656.4A
Other languages
Chinese (zh)
Other versions
CN113516744B (en)
Inventor
伏鹏
Current Assignee
Alibaba Group Holding Ltd
Original Assignee
Alibaba Group Holding Ltd
Priority date
Filing date
Publication date
Application filed by Alibaba Group Holding Ltd filed Critical Alibaba Group Holding Ltd
Priority to CN202010486656.4A priority Critical patent/CN113516744B/en
Publication of CN113516744A publication Critical patent/CN113516744A/en
Application granted granted Critical
Publication of CN113516744B publication Critical patent/CN113516744B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 15/00 - 3D [Three Dimensional] image rendering
    • G06T 15/005 - General purpose rendering architectures
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048 - Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0481 - Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F 3/04815 - Interaction with a metaphor-based environment or interaction object displayed as three-dimensional, e.g. changing the user viewpoint with respect to the environment or object
    • G06F 30/00 - Computer-aided design [CAD]
    • G06F 30/20 - Design optimisation, verification or simulation
    • G06Q - INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q 30/00 - Commerce
    • G06Q 30/06 - Buying, selling or leasing transactions
    • G06Q 30/0601 - Electronic shopping [e-shopping]
    • G06Q 30/0641 - Shopping interfaces
    • G06Q 30/0643 - Graphical representation of items or shoppers
    • G06T 19/00 - Manipulating 3D models or images for computer graphics
    • G06T 19/20 - Editing of 3D images, e.g. changing shapes or colours, aligning objects or positioning parts

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • Business, Economics & Management (AREA)
  • Computer Graphics (AREA)
  • Finance (AREA)
  • Computer Hardware Design (AREA)
  • Accounting & Taxation (AREA)
  • Architecture (AREA)
  • Software Systems (AREA)
  • Geometry (AREA)
  • Evolutionary Computation (AREA)
  • Human Computer Interaction (AREA)
  • Development Economics (AREA)
  • Economics (AREA)
  • Marketing (AREA)
  • Strategic Management (AREA)
  • General Business, Economics & Management (AREA)
  • Processing Or Creating Images (AREA)

Abstract

The embodiment of the application provides a method and equipment for virtual scene generation, interface interaction and commodity display. The method comprises the following steps: acquiring a first simulation strategy and a second simulation strategy corresponding to the virtual scene, wherein the virtual scene corresponds to a first change stage corresponding to the first simulation strategy and a second change stage corresponding to the second simulation strategy; executing a first simulation process with respect to the first variation phase based on the first simulation strategy; executing a second simulation process with respect to the second variation phase based on the second simulation strategy. The technical scheme provided by the embodiment of the application can improve the generation efficiency and the generation effect of the virtual scene.

Description

Virtual scene generation method, interface interaction method, commodity display method and equipment
Technical Field
The present application relates to the technical field of computers, and in particular to methods and devices for virtual scene generation, interface interaction and commodity display.
Background
A virtual scene is a virtual, graphical representation of a real scene, constructed from one or more kinds of elements such as objects, lights and cameras. Each object element records information such as geometry, material, texture and placement position; each light element records information such as light type, light-emitting parameters and light position; each camera element records information such as camera position, orientation, field of view, near clipping plane and far clipping plane. On some game or training platforms, a virtual natural scene needs to be created.
Generating a virtual natural scene involves not only the geometry and material modeling of the whole terrain, but also the geometry and material modeling of every element in the scene and the distribution of those elements across the scene. Because of humans' innate familiarity with the natural world, the distinctive features of various terrains are recognized quickly, so the requirement on the realism of a computer-generated natural scene is extremely high: any detail error or local inconsistency is easily noticed.
Disclosure of Invention
The present application provides methods and devices for virtual scene generation, interface interaction and commodity display, so as to improve the generation efficiency and generation effect of a virtual scene.
Thus, in one embodiment of the present application, a virtual scene generation method is provided. The method comprises the following steps:
acquiring a first simulation strategy and a second simulation strategy corresponding to the virtual scene, wherein the virtual scene corresponds to a first change stage corresponding to the first simulation strategy and a second change stage corresponding to the second simulation strategy;
executing a first simulation process with respect to the first variation phase based on the first simulation strategy;
executing a second simulation process with respect to the second variation phase based on the second simulation strategy.
In another embodiment of the present application, a virtual scene generation method is provided. The method comprises the following steps:
executing a simulation process related to the change of the virtual scene aiming at the virtual scene to be generated to obtain first data information of a first scene element of the virtual scene;
acquiring second data information of the first scene element;
according to the second data information of the first scene element, modifying the first data information of the first scene element to obtain target data information of the first scene element;
and generating the virtual scene by combining the target data information of the first scene element.
In yet another embodiment of the present application, an interface interaction method is provided. The method comprises the following steps:
displaying a virtual scene obtained by simulation on a user interface; the virtual scene obtained through simulation is generated by combining first data information of the first scene element obtained through simulation;
responding to second data information of the first scene element input by a user on the user interface, and correcting the first data information of the first scene element obtained by simulation according to the second data information of the first scene element to obtain target data information of the first scene element;
displaying the corrected virtual scene on the user interface; the modified virtual scene is generated in combination with the target data information of the first scene element.
In one embodiment of the present application, an electronic device is provided. The electronic device is configured to perform the following operations:
acquiring a first simulation strategy and a second simulation strategy corresponding to the virtual scene, wherein the virtual scene corresponds to a first change stage corresponding to the first simulation strategy and a second change stage corresponding to the second simulation strategy;
executing a first simulation process with respect to the first variation phase based on the first simulation strategy;
executing a second simulation process with respect to the second variation phase based on the second simulation strategy.
In another embodiment of the present application, an electronic device is provided. The electronic device is configured to perform the following operations:
executing a simulation process related to the change of the virtual scene aiming at the virtual scene to be generated to obtain first data information of a first scene element of the virtual scene;
acquiring second data information of the first scene element;
according to the second data information of the first scene element, modifying the first data information of the first scene element to obtain target data information of the first scene element;
and generating the virtual scene by combining the target data information of the first scene element.
In yet another embodiment of the present application, an electronic device is provided. The electronic device is configured to perform the following operations:
displaying a virtual scene obtained by simulation on a user interface; the virtual scene obtained through simulation is generated by combining first data information of the first scene element obtained through simulation;
responding to second data information of the first scene element input by a user on the user interface, and correcting the first data information of the first scene element obtained by simulation according to the second data information of the first scene element to obtain target data information of the first scene element;
displaying the corrected virtual scene on the user interface; the modified virtual scene is generated in combination with the target data information of the first scene element.
In yet another embodiment of the present application, a merchandise display method is provided. The method comprises the following steps:
determining a virtual scene for displaying the commodity;
acquiring a first simulation strategy and a second simulation strategy corresponding to the virtual scene, wherein the virtual scene corresponds to a first change stage corresponding to the first simulation strategy and a second change stage corresponding to the second simulation strategy;
executing a first simulation process with respect to the first variation phase based on the first simulation strategy;
executing a second simulation process with respect to the second variation phase based on the second simulation strategy;
and generating and displaying the virtual scene according to a simulation result in the first simulation process and the second simulation process.
According to the technical solution provided in the embodiments of the present application, the virtual scene is generated by simulating the change process of the virtual scene, which improves both the generation efficiency and the generation effect of the virtual scene. In addition, different simulation strategies are adopted for different change stages when simulating the change process of the virtual scene, which makes the simulation more targeted, further improves the generation effect, and makes the generated virtual scene more natural and harmonious.
In the technical scheme provided by the embodiment of the application, in order to enhance the controllability of the virtual scene generation, the second data information of the first scene element input by the user is introduced and is applied to the simulation result, so that the controllability of the finally generated virtual scene is improved.
Drawings
In order to more clearly illustrate the embodiments of the present application or the technical solutions in the prior art, the drawings needed to be used in the description of the embodiments or the prior art will be briefly described below, and it is obvious that the drawings in the following description are some embodiments of the present application, and other drawings can be obtained by those skilled in the art without creative efforts.
Fig. 1a is an exemplary diagram of a virtual scene generation method according to an embodiment of the present application;
fig. 1b is a schematic flowchart of a virtual scene generation method according to an embodiment of the present application;
fig. 1c is a diagram of another example of a virtual scene generation method according to an embodiment of the present application;
fig. 2 is a schematic flowchart of a virtual scene generation method according to another embodiment of the present application;
FIG. 3 is a schematic flowchart of an interface interaction method according to another embodiment of the present application;
fig. 4 is a block diagram of a virtual scene generation apparatus according to an embodiment of the present application;
fig. 5 is a block diagram of a virtual scene generation apparatus according to another embodiment of the present application;
FIG. 6 is a block diagram of an interface interaction device according to another embodiment of the present application;
fig. 7 is a block diagram of an electronic device according to another embodiment of the present application.
Detailed Description
Currently, there are mainly two methods to generate virtual natural scenes:
one is that a professional programmer manually draws an element density map of various resource elements, which are then visually presented in a rendering engine. The method is suitable for small-scale scene generation, but has low efficiency, can not quickly construct a super-large-scale natural scene, has poor diversity and is easy to generate small detail errors, so that the reality sense is difficult to guarantee.
The other is a construction method based on multi-view aerial scanning by unmanned aerial vehicles. This method has a higher degree of automation, but its precision is insufficient: it is only suitable for macroscopic, bird's-eye-view display, cannot achieve precise microscopic modeling, and is therefore unsuitable for automatically generating game battlefield scenes or training battlefield scenes.
While researching the technical solution provided in the embodiments of the present application, the inventor found that a virtual scene can be generated by simulating the change process of the virtual scene. This avoids the problems of manual drawing, such as low efficiency, the inability to quickly construct very large-scale natural scenes, poor diversity and a high likelihood of errors, and it also enables precise microscopic modeling, making the approach suitable for building game battlefield scenes or training battlefield scenes.
In addition, the inventor further found, while researching the technical solution provided in the embodiments of the present application, that if the same simulation strategy is adopted throughout the whole change process to be simulated, the first data information (for example, the density maps) of the scene elements that make up the virtual scene is difficult to converge quickly to a stable equilibrium state. If the simulation of the whole change process ends before the first data information of some scene elements has converged, the finally generated virtual scene is not natural and harmonious enough and the presentation effect is poor. Although the convergence problem can be alleviated to a certain extent by extending the change process to be simulated, that is, by extending the change time to be simulated (for example, from 20 years to 30 years), this increases the time required for the simulation and the consumption of computing resources; in other words, extending the change process to be simulated not only reduces the generation efficiency of the virtual scene but also increases its generation cost.
Through analysis, the inventor found the reason why, in a scheme that uses the same simulation strategy for the whole change process to be simulated, the first data information of the scene elements is difficult to converge quickly to a stable equilibrium state: among the scene elements that make up the virtual scene, the first data information (such as the density map) of some elements converges quickly to a stable equilibrium state (for example, soil and the various kinds of vegetation in a natural scene), while that of other elements converges slowly (for example, the various kinds of rock in a natural scene). If the same simulation strategy is used throughout the whole change process, the slowly converging scene elements tend to converge slowly or even fail to converge. A scheme that simulates the whole change process with a single simulation strategy is therefore poorly targeted, makes it difficult to use the simulation computing resources reasonably, and degrades both the simulation efficiency and the simulation effect.
Therefore, the inventor proposes generating the virtual scene by simulating its change process in several change stages and using a different simulation strategy for each stage. This makes it convenient to run a targeted simulation of the change process of the slowly converging scene elements in a particular change stage, to use the simulation computing resources reasonably, and to accelerate the convergence of the elements that are hard to converge. As a result, the simulation efficiency and the generation effect of the virtual scene can be further improved, and the finally obtained virtual scene is more natural and harmonious.
Interpretation of terms:
Density map: a grayscale map representing the distribution density of a scene element over a certain area; the index of each pixel represents the longitude and latitude of that point, and the grayscale value (0-1) represents the distribution density of the scene element at that point.
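As an illustration only (not part of the patent), a density map of this kind might be held in memory as follows; the array shape, the coordinate mapping and the element name are assumptions:

```python
import numpy as np

# A density map is a grayscale grid: the row/column index of a pixel maps to
# the latitude/longitude of a point, and the value in [0, 1] is the local
# distribution density of one scene element (e.g. a grass layer).
HEIGHT, WIDTH = 512, 512          # assumed resolution of the simulated area

grass_density = np.zeros((HEIGHT, WIDTH), dtype=np.float32)

def set_density(density_map: np.ndarray, row: int, col: int, value: float) -> None:
    """Write a density value, clamped to the valid [0, 1] range."""
    density_map[row, col] = min(max(value, 0.0), 1.0)

set_density(grass_density, 100, 200, 0.75)
print(grass_density[100, 200])    # 0.75
```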
In order to make the technical solutions better understood by those skilled in the art, the technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application. It is to be understood that the embodiments described are only a few embodiments of the present application and not all embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present application.
Further, in some flows described in the specification, claims, and above-described figures of the present application, a number of operations are included that occur in a particular order, which operations may be performed out of order or in parallel as they occur herein. The sequence numbers of the operations, e.g., 101, 102, etc., are used merely to distinguish between the various operations, and do not represent any order of execution per se. Additionally, the flows may include more or fewer operations, and the operations may be performed sequentially or in parallel. It should be noted that, the descriptions of "first", "second", etc. in this document are used for distinguishing different messages, devices, modules, etc., and do not represent a sequential order, nor limit the types of "first" and "second" to be different.
Fig. 1b shows a schematic flowchart of a virtual scene generation method provided in an embodiment of the present application. The execution subject of the method may be a client or a server. The client may be hardware with an embedded program integrated on a terminal, software installed in the terminal, or tool software embedded in the operating system of the terminal, which is not limited in the embodiments of the present application. The terminal may be any terminal device, including a mobile phone, a tablet computer, a smart speaker and the like. The server may be an ordinary server, a cloud server, a virtual server or the like, which is not specifically limited in the embodiments of the present application. As shown in fig. 1b, the method comprises:
101. and acquiring a first simulation strategy and a second simulation strategy corresponding to the virtual scene.
The virtual scene corresponds to a first change phase corresponding to the first simulation strategy and a second change phase corresponding to the second simulation strategy.
102. Executing a first simulation process with respect to the first variation phase based on the first simulation strategy.
103. Executing a second simulation process with respect to the second variation phase based on the second simulation strategy.
In the above 101, the change process to be simulated of the virtual scene is divided into a plurality of change phases. The number of the plurality of change stages is at least two, and the plurality of change stages include a first change stage and a second change stage. The first simulation strategy corresponding to the first change stage is different from the second simulation strategy corresponding to the second change stage.
In practical application, corresponding simulation strategies may be set for the plurality of change stages, respectively, and the simulation strategies corresponding to different change stages may be different. Wherein each change phase corresponds to a change time period.
For example:
example 1: for example, the total duration corresponding to the change process to be simulated of the virtual scene is preset to T years, and the T years can be divided into two change phases: the 1 st change stage corresponds to the first T1 years, and the 2 nd change stage corresponds to the last T2 years, wherein T is T1+ T2.
Example 2: the total duration corresponding to the to-be-simulated change process of the virtual scene is preset to be 1 year, and the 1 year can be divided into four change stages: the 1 st change stage corresponds to spring, the 2 nd change stage corresponds to summer, the 3 rd change stage corresponds to autumn and the 4 th change stage corresponds to winter.
Example 3: The total duration corresponding to the change process to be simulated of the virtual scene is preset to be 1 day, and the 1 day can be divided into eight change stages: the 1st change stage corresponds to early morning, the 2nd to morning, the 3rd to forenoon, the 4th to noon, the 5th to afternoon, the 6th to evening, the 7th to night and the 8th to midnight.
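As a rough sketch of how the change phases in the examples above might be encoded (the data structure, field names and durations are illustrative assumptions, not taken from the patent):

```python
from dataclasses import dataclass

@dataclass
class ChangePhase:
    name: str
    duration_years: float   # length of the simulated evolution period

# Example 1: a T-year change process split into two phases, with T = T1 + T2.
T1, T2 = 15.0, 5.0
two_phase_plan = [ChangePhase("phase 1", T1), ChangePhase("phase 2", T2)]

# Example 2: one simulated year split into four seasonal phases.
seasonal_plan = [
    ChangePhase("spring", 0.25),
    ChangePhase("summer", 0.25),
    ChangePhase("autumn", 0.25),
    ChangePhase("winter", 0.25),
]

print(sum(p.duration_years for p in two_phase_plan))   # 20.0, i.e. T = T1 + T2
```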
The virtual scene may be a virtual natural scene, a virtual space scene, or the like. When the virtual scene is a virtual natural scene, the first variation phase and the second variation phase may both relate to natural evolution, and the variation time period may also be understood as a natural evolution time period. The natural evolution comprises a terrain evolution and/or an ecological evolution. In an example, the natural evolution includes a terrain evolution and an ecological evolution. Therefore, in the whole simulation process, the terrain evolution simulation and the ecological evolution simulation are combined, the mutual influence between the terrain evolution and the ecological evolution can be described, and the improvement of the fidelity of the virtual natural scene is facilitated.
Wherein, the first simulation strategy may include a first event set that causes the virtual scene to change, and the second simulation strategy may include a second event set that causes the virtual scene to change. Taking the virtual scene as a virtual natural scene as an example, the events causing the virtual natural scene to change (i.e. the events driving the natural evolution of the virtual natural scene) include but are not limited to: rainfall events, rock weathering events, landslide events (which can be further divided into soil, sand, rock and other landslide events), ecological events, lightning events, earthquake events, fire events and the like. Taking the virtual scene as a virtual space scene as an example, the events causing the virtual space scene to change include but are not limited to: solar proton events and celestial collision events.
In addition, the first simulation strategy may further include the selected probabilities and/or simulation parameters corresponding to the events in the first event set, and the second simulation strategy may further include the selected probabilities and/or simulation parameters corresponding to the events in the second event set. The selected probability corresponding to an event is the probability that the event is drawn from the corresponding event set. Different events have different simulation parameters, for example: the simulation parameters corresponding to a rainfall event include the annual rainfall, the water absorption of different soil types, the surface water evaporation capacity and the like; the simulation parameters corresponding to a rock weathering event include the annual average day-night temperature difference, the annual average day-night temperature difference threshold and the like; the simulation parameters corresponding to a rock landslide event include a first inclination threshold and the like; the simulation parameters corresponding to an ecological event include the growth conditions of the various vegetation types, for example humidity conditions, air temperature conditions and illumination conditions (taking the humidity conditions as an example, they comprise a maximum limit humidity, a maximum ideal humidity, a minimum ideal humidity and a minimum limit humidity); the simulation parameters corresponding to a lightning event include an altitude threshold, a second inclination threshold and the like.
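To make the structure of such a simulation strategy concrete, the sketch below bundles an event set with per-event selected probabilities and simulation parameters; the field names, event names and numeric values are assumptions for illustration only:

```python
from dataclasses import dataclass, field
from typing import Any, Dict, List

@dataclass
class SimulationStrategy:
    # Events that may cause the virtual scene to change during this stage.
    event_set: List[str]
    # Probability of each event being drawn from the event set.
    selected_probability: Dict[str, float]
    # Event-specific simulation parameters.
    parameters: Dict[str, Dict[str, Any]] = field(default_factory=dict)

first_strategy = SimulationStrategy(
    event_set=["rainfall", "rock_weathering", "rock_landslide", "ecological", "lightning"],
    selected_probability={"rainfall": 0.4, "rock_weathering": 0.2, "rock_landslide": 0.1,
                          "ecological": 0.25, "lightning": 0.05},
    parameters={
        "rainfall": {"annual_rainfall_mm": 800.0, "evaporation_mm": 300.0},
        "lightning": {"altitude_threshold_m": 2500.0, "inclination_threshold_deg": 35.0},
    },
)
```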
In an example, the first set of events and the second set of events are different.
Continue with example 2 above: the first change stage is the 1st change stage corresponding to spring; the second change stage is the 4th change stage corresponding to winter. Since the natural events that may occur in spring and in winter differ, the first event set and the second event set may be set differently in order to improve the simulation effect. The first event set corresponding to the first change stage may include rainfall events, ecological events and the like; the second event set corresponding to the second change stage may include snowfall events, ecological events and the like.
Continue with example 3 above: the first change stage is the 1 st change stage corresponding to the morning; the second change phase is the 2 nd change phase corresponding to the morning. Also, since the natural events that occur in the morning and in the morning are different, the first event set and the second event set may be set to be different in order to improve the simulation effect. For example: the first set of events corresponding to the first variation phase may include: fog events, rainfall events, and the like; the second set of events corresponding to the second variation phase may include: a rainfall event, etc.
In addition, the selected probability and the simulation parameter corresponding to the same event in the first simulation strategy and the second simulation strategy can be different. The specific configuration may be set according to actual needs, and this embodiment does not specifically limit this. The targeted simulation is realized through the difference of event sets, selected probabilities and/or simulation parameters in the simulation strategy, so that the generation effect of the virtual scene can be improved.
In 102, according to the first simulation strategy, a first simulation procedure related to the first variation phase is executed.
In 103, according to the second simulation strategy, a second simulation procedure related to the second variation phase is executed.
In an example, a dynamic change process of a virtual scene may be shown, and the method may further include: and generating and displaying the virtual scene according to a simulation result in the first simulation process and the second simulation process.
In another example, the virtual scene may be statically shown. For example: and when the simulation is finished, obtaining a virtual scene obtained by final simulation, and displaying the virtual scene obtained by final simulation.
According to the technical solution provided in the embodiments of the present application, the virtual scene is generated by simulating the change process of the virtual scene, which improves both the generation efficiency and the generation effect of the virtual scene. In addition, different simulation strategies are adopted for different change stages when simulating the change process of the virtual scene, which makes the simulation more targeted, further improves the generation effect, and makes the generated virtual scene more natural and harmonious.
In an example, the simulation process corresponding to each change stage may be executed in sequence according to the time sequence of the plurality of change stages and based on the simulation strategy corresponding to each change stage.
In an example, the first simulation strategy and the second simulation strategy relate to the respective simulation convergence speeds of a plurality of scene elements of the virtual scene, where the plurality of scene elements are used to compose the virtual scene. Taking the virtual scene as a virtual natural scene as an example, the scene elements of the virtual scene include but are not limited to: a crust rock stratum, a soil layer, a rock block layer, a sandy soil layer, a grass layer, a shrub layer, a broad-leaved tree layer, a coniferous tree layer, a dead tree layer, an animal layer and the like. Of course, in practical applications, the scene elements may be further refined in order to depict the virtual scene more finely, for example: the grass layer can be subdivided into various grass layers, and the broad-leaved tree layer can be subdivided into various broad-leaved tree layers, and so on. It should be noted that the specific definitions and the number of the scene elements of the virtual scene may be set according to actual needs, which is not specifically limited in the embodiments of the present application.
In actual application, the first simulation strategy and the second simulation strategy may be determined according to actual experience or simulation convergence conditions of test cases (i.e., test virtual scenarios).
In an implementation, the method may further include:
104. first information is acquired.
105. Determining the first simulation strategy and the second simulation strategy.
The above 104, wherein the first information relates to a simulation convergence rate corresponding to each of a plurality of scene elements of the virtual scene; the larger the simulation convergence speed is, the faster the first data information of the corresponding scene element converges to a stable state in the third simulation process; and the same simulation strategy is adopted in the third simulation process. The convergence to the stable state means that the first data information has a variation amplitude smaller than a preset amplitude as the simulation continues. The preset amplitude can be set according to actual needs, and this is not specifically limited in the embodiment of the present application. The same simulation strategy may specifically be a preset test simulation strategy for testing, and the third simulation process refers to a simulation process for executing a change of a test virtual scene based on the same simulation strategy. Wherein, the number and the types of the scene elements used for forming the test virtual scene are the same as the number and the types of the scene elements used for forming the virtual scene. In order to reduce the calculation cost, the scale of the test virtual scene can be smaller than that of the virtual scene.
In practical application, a third simulation process related to the change of the test virtual scene may be executed based on the same simulation strategy; in combination with the convergence of the first data information of each of the scene elements of the test virtual scene during the third simulation process, first information about the simulation convergence speed corresponding to each of the scene elements of the test virtual scene may be obtained and used as the first information about the simulation convergence speed corresponding to each of the scene elements of the virtual scene.
Generally, the simulation convergence rate of the soil layer is greater than that of the vegetation layer, which is greater than that of the rock block layer. In an example, the first information may specifically be simulation convergence speeds corresponding to the plurality of scene elements.
In this example, the same simulation strategy is adopted in the third simulation process, so that the determined simulation convergence speed is representative and more reasonable. In an example, the same simulation strategy includes a third event set causing a test virtual scene change and a selected probability corresponding to each event in the third event set; the third event set may specifically be composed of a preset plurality of events (the preset plurality of events may be understood as all events related to the virtual scene); and the selected probabilities corresponding to the events in the same simulation strategy are the same. In this way, the representativeness and rationality of the finally determined simulation convergence speed can be further improved.
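A minimal sketch of how the simulation convergence speed of a scene element might be estimated from such a test run: an element is considered converged once the change of its first density map between consecutive snapshots of the third simulation process stays below a preset amplitude. The snapshot interface and the mean-absolute-change metric are assumptions:

```python
import numpy as np
from typing import List

def steps_to_converge(snapshots: List[np.ndarray], preset_amplitude: float = 1e-3) -> int:
    """Return the first snapshot index at which the density map of one scene
    element changes by less than `preset_amplitude` (measured here as the mean
    absolute per-pixel change); -1 means the element never converged."""
    for i in range(1, len(snapshots)):
        if np.abs(snapshots[i] - snapshots[i - 1]).mean() < preset_amplitude:
            return i
    return -1

# Elements that converge in fewer steps of the test (third) simulation process
# have a higher simulation convergence speed; typically soil converges before
# vegetation, and vegetation before rock.
```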
In the above 105, the first simulation strategy and the second simulation strategy are set in conjunction with the first information.
In one example: the change process to be simulated of the test virtual scene is divided to obtain a plurality of change stages of the test virtual scene; a corresponding initial simulation strategy is set for each of these change stages in combination with the first information; a fourth simulation process corresponding to each change stage of the test virtual scene is executed in sequence, based on the simulation strategy corresponding to that change stage; the simulation strategy corresponding to each change stage of the test virtual scene is then optimized in combination with the convergence of the first data information of each scene element of the test virtual scene during the fourth simulation process, until simulation strategies satisfying a preset condition are obtained for the change stages of the test virtual scene; finally, the simulation strategies corresponding to the change stages of the test virtual scene that satisfy the preset condition are determined as the simulation strategies corresponding to the change stages of the virtual scene. The plurality of change stages of the virtual scene include the first change stage and the second change stage and correspond one to one to the plurality of change stages of the test virtual scene.
The preset conditions may be set according to actual needs, and this embodiment does not specifically limit this. The preset condition is related to the convergence condition of the first data information of each of the scene elements of the test virtual scene.
The embodiment provides a scheme for automatically optimizing the simulation strategies corresponding to each change stage. Of course, in practical application, the simulation strategies corresponding to the multiple change stages of the test virtual scene that satisfy the preset condition may also be obtained based on a manual adjustment mode, which is not specifically limited in this embodiment of the application.
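The automatic optimization described above might be organized roughly as follows; `run_phase`, `all_converged` and `adjust` are hypothetical placeholders standing in for the fourth simulation process, the preset condition and the strategy adjustment, respectively:

```python
def optimize_phase_strategies(initial_strategies, test_scene,
                              run_phase, all_converged, adjust, max_rounds: int = 20):
    """Iteratively refine the per-stage simulation strategies of the test
    virtual scene until the preset convergence condition is satisfied.

    run_phase(test_scene, strategy) -> convergence record for that change stage
    all_converged(records)          -> True once the preset condition holds
    adjust(strategies, records)     -> updated list of strategies
    """
    strategies = list(initial_strategies)
    for _ in range(max_rounds):
        records = [run_phase(test_scene, s) for s in strategies]  # fourth simulation process
        if all_converged(records):
            return strategies   # reuse these strategies for the full-scale virtual scene
        strategies = adjust(strategies, records)
    return strategies           # best effort if the condition was never met
```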
In one example, the first data information may include a first density map. Of course, in practical applications, the first data information may further include other data, for example, the vegetation layer, and the first data information may further include a vegetation size distribution map, a vegetation age distribution map, and the like. The vegetation size distribution map can be understood as: a gray scale map representing the size of the scene element (e.g., the height of the tree) in a certain area, the index value of each pixel point represents the longitude and latitude of the point, and the gray scale value (0-1) represents the size of the scene element in the point. The vegetation age profile can be understood as: a gray scale graph representing the age (e.g., tree age) of the scene element in a certain area, the index value of each pixel point represents the longitude and latitude of the point, and the gray scale value (0-1) represents the age of the scene element at the point.
In practical application, a first simulation process related to the first variation phase and a second simulation process related to the second variation phase are executed, so as to adjust the first data information of each of the plurality of scene elements. Before the step 102, the first data information of each of the plurality of scene elements may also be initialized, specifically, the first data information of each of the plurality of scene elements is initialized to serve as a simulation basis.
In an example, an initial value of the first data information of each of the plurality of scene elements may be determined to be 0, that is, the gray-scale value at each pixel point in the first density map of each scene element is set to 0. Of course, the initial value of the first data information of each of the plurality of scene elements may be set according to actual needs, which is not specifically limited in this embodiment of the application.
In an implementation manner, in the above 105, "determining the first simulation strategy and the second simulation strategy" specifically includes:
1051. and determining a first event set causing the virtual scene change in the first simulation strategy and a second event set causing the virtual scene change in the second simulation strategy by combining the first information.
The occurrence of any event may cause a change in the corresponding scene element, i.e. cause an adjustment of the first data information of the corresponding scene element, and in general different events cause changes to different scene elements. To accelerate the convergence of scene elements with a slow convergence speed, a targeted simulation may be run on the change process of those elements in a certain change stage; that is, for that change stage, at least one event that affects the convergence of the slowly converging scene elements is made into the event set of the simulation strategy corresponding to that stage. In that change stage, only the events that affect the convergence of the slowly converging scene elements are simulated and other events are not, so the simulation computing resources are used reasonably, the convergence of the slowly converging scene elements is accelerated, and the generation effect is improved.
Specifically, in 1051, "determining a first event set causing the virtual scene change in the first simulation policy and a second event set causing the virtual scene change in the second simulation policy, in combination with the first information," may specifically be implemented by the following steps:
s11, combining the first information, dividing the scene elements into a plurality of convergence levels.
Wherein, the higher the convergence level is, the greater the simulation convergence speed of the corresponding scene element is.
And S12, dividing the change process to be simulated related to the virtual scene into a plurality of change stages.
The number of the plurality of convergence levels is the same as the number of the plurality of variation phases; the number of the plurality of variation phases is n; the plurality of variation phases includes the first variation phase and the second variation phase.
S13, determining an event set in the simulation strategy corresponding to the ith change stage according to at least one event which affects the convergence of scene elements at the ith convergence level and below in a plurality of preset events.
Wherein i is an integer with a value range of [1, n]; the i-th convergence level is higher than the (i+1)-th convergence level, and the i-th variation phase precedes the (i+1)-th variation phase.
In S11, the convergence levels of the scene elements may be divided according to the magnitude of the simulated convergence rate of each scene element.
In the above step S12, the change process to be simulated related to the virtual scene is divided to obtain a plurality of change stages of the virtual scene.
In the above S13, the preset plurality of events may be understood as all events related to the virtual scene.
All or part of the events, among the preset plurality of events, that affect the convergence of scene elements at the i-th convergence level and below may be combined into the event set in the simulation strategy corresponding to the i-th change stage.
For example: for the 1 st change phase, the 1 st convergence level and the following convergence level scene elements refer to all scene elements, so that a plurality of preset events can be combined into an event set in the simulation strategy corresponding to the 1 st change phase.
In this way, after the convergence of the ith convergence level scene element is completed in the ith change stage, the targeted simulation may be performed only on the non-convergence scene element (i.e. the scene element with the convergence level smaller than the ith convergence level) in the subsequent other change stages. Therefore, simulation computing resources can be reasonably utilized, and the convergence of the unconverged scene elements is accelerated, so that the simulation efficiency and the simulation effect are improved.
For example: the plurality of variation phases only include a 1 st variation phase and a 2 nd variation phase; the 1 st change phase precedes the 2 nd change phase. According to the first information, a plurality of scene elements are divided into 2 convergence levels: a 1 st convergence level and a 2 nd convergence level. Determining an event set in a simulation strategy corresponding to the 1 st change stage according to a plurality of preset events; and determining an event set in the simulation strategy corresponding to the 2 nd change stage according to at least one event influencing the convergence of the 2 nd convergence level scene element in a plurality of preset events. For example: the plurality of scene elements includes: a soil layer, a vegetation layer and a rock block layer; wherein, the convergence levels of the soil layer and the vegetation layer are both 1 st convergence level; the convergence level of the rock block layer is the 2 nd convergence level. The preset events comprise: ecological events, rainfall events, lightning events, rock weathering events, and rock landslide events. And then, combining the ecological events, the rainfall events, the lightning events, the rock weathering events and the rock landslide events into an event set in the simulation strategy corresponding to the 1 st change stage. Among the events that affect convergence of rock masses are lightning events, rock weathering events, and rock landslide events. Therefore, all or part of the lightning event, the rock weathering event and the rock landslide event can be combined into an event set in the simulation strategy corresponding to the 2 nd change stage. In practical application, considering that a lightning event and a rock weathering event can generate new rock blocks, a rock landslide event can influence the distribution condition of the generated rock blocks, and in order to avoid the situation that the generation of the new rock blocks can prolong the convergence time, the rock landslide event can be only formed into an event set in the simulation strategy corresponding to the 2 nd change stage.
In practical applications, the "executing the first simulation process related to the first variation phase based on the first simulation strategy" in the above 102 may specifically be implemented by the following steps:
and S21, acquiring a first event set in the first simulation strategy.
And S22, aiming at the first time point in the first change phase, executing a simulation process of generating the selected event at the selected position.
S23, adjusting the first data information of each scene element according to the simulation result of the selected event at the selected position.
Wherein the selected place is selected from the virtual scene for the first point in time; the selected event is selected from the first set of events for the first point in time.
In the above S21, the process of determining the first event set may refer to corresponding contents in the above embodiments, and is not described herein again.
In S22, the first time point refers to an arbitrary time point in the first variation phase.
In one example, the selected location may be extracted from the virtual scene by performing an equiprobable sampling for the first time point.
In addition, the selected event may be an event that is extracted from the first event set by performing an equiprobable sampling for the first time point.
In another example, the first simulation policy further includes a selected probability corresponding to each event in the first event set. Thus, the method may further include: and aiming at the first time point, combining the selected probability corresponding to each event in the first event set, and executing unequal probability sampling to extract one event from the first event set as the selected event.
The selected probability corresponding to each event in the first event set can be determined according to historical climate data of a real area corresponding to the virtual scene, so that the virtual scene more fitting the actual situation can be simulated.
It should be added that a traversal mode can be adopted to traverse all the time points in the first change stage, and the above step S22 is executed for each time point within the first change phase. Specifically, a time point may be selected at every preset natural-evolution time interval (e.g., one day or one week).
Optionally, the first simulation strategy corresponding to the first change phase further includes simulation parameters corresponding to each event in the first event set. In the above S22, "executing a simulation process of generating a selected event at a selected location for a first time point in the first change phase" specifically includes: and aiming at the first time point, combining the simulation parameters corresponding to the selected event in the first simulation strategy, and executing the simulation process of the selected event at the selected place. In this way, the authenticity of the simulation can be improved.
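The per-time-point sampling of S21-S23 could look roughly like the sketch below: the time points of the first change phase are traversed at the preset evolution interval, a location is drawn from the scene with equal probability, and an event is drawn from the first event set with its selected probability; `simulate_event` is a hypothetical placeholder for the event-specific update of the density maps:

```python
import random

def run_simulation_phase(strategy, terrain_size, phase_years, step_years,
                         density_maps, simulate_event):
    """One simulation process for a change phase.

    `strategy` follows the SimulationStrategy sketch above; `simulate_event`
    applies a selected event at a selected place, using the event's simulation
    parameters, and updates `density_maps` in place.
    """
    height, width = terrain_size
    steps = int(phase_years / step_years)                 # preset evolution interval
    for _ in range(steps):                                # traverse the time points
        place = (random.randrange(height), random.randrange(width))  # equiprobable place
        event = random.choices(                           # unequal-probability event sampling
            strategy.event_set,
            weights=[strategy.selected_probability[e] for e in strategy.event_set],
        )[0]
        simulate_event(event, place, strategy.parameters.get(event, {}), density_maps)
```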
Because the selected place and the selected event are chosen at random, directly executing the simulation process of the selected event occurring at the selected place without any check may produce unreasonable results. For example: generally, lightning events occur at high altitudes and on protruding terrain, and rarely or never at low altitudes and on smooth terrain. If the selected event is a lightning event and the selected place is a low-altitude place with smooth terrain, the simulation process of the selected event occurring at the selected place should not be executed; otherwise, the finally generated virtual natural scene is not realistic and harmonious enough. Therefore, in practical applications, the step S22 of "executing a simulation process of generating a selected event at a selected point with respect to a first time point in the first variation phase" may specifically be executed by:
s221, acquiring the state information of the selected place at the first time point.
Wherein the state information is related to the selected event.
S222, determining the occurrence probability of the selected event at the selected place according to the state information and the simulation parameters corresponding to the selected event.
And S223, when the occurrence probability is greater than or equal to a first preset threshold value, executing a simulation process of the selected event occurring at the selected place aiming at the first time point.
In the above S221, the state information related to different events is different. In an example, the state information may be determined according to first data information of each of the plurality of scene elements at the first time point and/or simulation parameters corresponding to the selected event.
When the virtual scene is a virtual natural scene, the state information may be determined according to a terrain height map of the virtual scene, first data information of each of the plurality of scene elements at the first time point, and/or simulation parameters corresponding to the selected event.
The following description will be given by taking a virtual scene as a virtual natural scene:
taking the selected event as the lightning event as an example, the state information may include: the altitude and inclination of the selected location at the first point in time. The greater the inclination, the more prominent the terrain; the smaller the inclination, the flatter the terrain.
Taking the selected event as a rock landslide event as an example, the rock landslide event generally occurs in a place with a high inclination and a low vegetation coverage, so the state information may include: the inclination of the selected place at the first time point and the vegetation coverage degree.
Taking the selected event as a rock weathering event as an example, the rock weathering event generally occurs in a place with large day-night temperature difference and bare rocks, so the state information may include: the degree of coverage of the soil, sand and/or vegetation at the first time point of the selected site and the diurnal temperature differential at the first time point of the selected site. Wherein, the annual day and night temperature difference can be obtained by analyzing the meteorological data of the real place corresponding to the selected place.
In the above step S222, the simulation parameters corresponding to different events are different.
Taking the selected event as a lightning event as an example, simulation parameters corresponding to the lightning event include: an altitude threshold, a second inclination threshold.
Taking the selected event as a rock landslide event as an example, simulation parameters corresponding to the rock landslide event include: a first inclination threshold value and a first vegetation coverage threshold value.
Taking the selected event as a rock weathering event as an example, the simulation parameters corresponding to the rock weathering event include: the annual average day-night temperature difference, the annual average day-night temperature difference threshold and the second coverage degree threshold. The annual average day-night temperature difference and the annual average day-night temperature difference threshold can be obtained from the historical climate data of the real area corresponding to the virtual scene.
In the above S223, taking the selected event as a lightning event as an example, the further the altitude of the selected place at the first time point exceeds the altitude threshold and the further its inclination exceeds the second inclination threshold, the greater the occurrence probability of the selected event at the selected place.
Taking the selected event as a rock landslide event as an example, the further the inclination of the selected place at the first time point exceeds the first inclination threshold and the further its vegetation coverage falls below the first vegetation coverage threshold, the greater the occurrence probability of the selected event at the selected place.
Taking the selected event as a rock weathering event as an example, the further the coverage of soil, sand and/or vegetation at the selected place at the first time point falls below the second coverage degree threshold and the further the annual average day-night temperature difference at the selected place exceeds the annual average day-night temperature difference threshold, the greater the occurrence probability of the selected event at the selected place.
In the above step S222, according to the state information and the simulation parameter corresponding to the selected event, a specific implementation process for determining the occurrence probability of the selected event at the selected location may be set according to actual needs, for example: a function for calculating the occurrence probability can be designed for each event, the state information and the simulation parameters corresponding to the selected event are directly input into the function subsequently, and the occurrence probability can be obtained by executing the function.
In the above S223, the size of the first preset threshold may be set according to actual needs, which is not specifically limited in this embodiment.
And when the occurrence probability is greater than or equal to a first preset threshold value, executing a simulation process of the selected event at the selected place aiming at the first time point. And when the occurrence probability is smaller than a first preset threshold value, ignoring the first time point, namely not executing the simulation process of the selected event at the selected place aiming at the first time point.
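As one hedged example of S221-S223, a lightning event might be gated as follows: the further the altitude and inclination of the selected place exceed the altitude threshold and the second inclination threshold, the larger the occurrence probability, and the simulation is only executed when that probability reaches the first preset threshold. The specific probability function and the numeric values are assumptions; the patent only requires that some such function be designed per event:

```python
def lightning_occurrence_probability(altitude_m: float, inclination_deg: float,
                                     altitude_threshold_m: float,
                                     inclination_threshold_deg: float) -> float:
    """Map how far the state of the selected place exceeds both thresholds to
    a probability in [0, 1]; below either threshold the probability is 0."""
    if altitude_m < altitude_threshold_m or inclination_deg < inclination_threshold_deg:
        return 0.0
    altitude_excess = min((altitude_m - altitude_threshold_m) / 1000.0, 1.0)
    inclination_excess = min((inclination_deg - inclination_threshold_deg) / 45.0, 1.0)
    return 0.5 * (altitude_excess + inclination_excess)

FIRST_PRESET_THRESHOLD = 0.3
p = lightning_occurrence_probability(3200.0, 50.0, 2500.0, 35.0)
if p >= FIRST_PRESET_THRESHOLD:
    print("execute the simulation of the lightning event at the selected place")
else:
    print("ignore this time point")
```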
Inspired by the random occurrence of events in nature, the scheme provided by the embodiment of the application defines a series of events, including rainfall events, rock weathering events, landslide events, ecological events, lightning events, fire events, earthquake events and the like, which act on the natural scene independently of each other. In a certain time period, an event can thus be randomly selected, randomly occur at a certain position of the natural scene, and gradually spread to adjacent areas along a path on the terrain until a certain time elapses. To simplify the computation and ensure that the simulation algorithm converges stably in a short time, the following rules are defined for the events (an illustrative sketch of these rules follows the list):
1. When an event spreads along a terrain path, it can spread to at most one adjacent point per step and cannot flow back, so that the event path never branches or forms a loop.
2. The effect of an event on the terrain has a range, which cannot be infinite.
3. The time interval between the occurrence of two adjacent events is sufficiently small relative to the total time of the simulation.
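A minimal sketch of a diffusion step that respects these rules is shown below; the helper names `neighbors_fn` and `height_fn` are hypothetical callbacks assumed to return the adjacent points and the terrain height of a point, respectively.

```python
import random

def diffuse_event(start, neighbors_fn, height_fn, max_steps=50):
    """Sketch of the diffusion rules: the event spreads to at most one
    neighbor per step, never flows back (rule 1), and its range is bounded
    by max_steps (rule 2)."""
    path = [start]
    visited = {start}
    current = start
    for _ in range(max_steps):
        # Candidate points: unvisited neighbors that are not higher than the current point.
        candidates = [p for p in neighbors_fn(current)
                      if p not in visited and height_fn(p) <= height_fn(current)]
        if not candidates:
            break
        current = random.choice(candidates)  # at most one next point, no branching
        visited.add(current)
        path.append(current)
    return path
```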
The influence of each event on the first density map of each scene element in the virtual scene is described below:
a rainfall event:
The annual rainfall of the real area corresponding to the virtual scene is acquired according to the climate data; the soil type, water absorption rate, surface moisture evaporation and other information of the area are obtained according to the geological data; rainfall events are then used to simulate the soil distribution and the surface humidity distribution map of the area.
When a rainfall event occurs at a certain point on the terrain, the initial rainfall at that point is the product of the annual rainfall and the event time interval (the unit may be the year, i.e. the preset natural evolution time interval). The rainfall then diffuses along paths on the terrain and affects other areas until the rainfall drops to zero or reaches a low-lying position of the terrain. When a rainfall event spreads along a terrain path, two problems must be solved: selecting the next diffusion point and determining the influence on surrounding neighborhood points. To select the next diffusion point, neighborhood points higher than the current point are excluded and one point is randomly selected from the remaining points; the lower a point is relative to the current point, the higher its probability of being selected. After the next point is selected, the rainfall flows from the current point to the next point; because the soil absorbs moisture, part of the rainfall is absorbed at the current point and the remainder flows on to the next point. The influence on surrounding neighborhood points is more complex: the phenomena of flowing-water erosion and sandy-soil loss are simulated at the same time. Flowing-water erosion means that, along the path of the water flow on the terrain, stones gradually change from large to small into sand and their edges gradually change from sharp to smooth, becoming cobblestones. Sandy-soil loss means that soil and sand migrate with the flowing water and accumulate at low-lying, gentle places, such as the delta formed where a river flows into the sea. Both phenomena are realized by incrementally modifying the first density maps of the scene elements on the water flow path.
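A minimal sketch of the two per-step decisions of the rainfall event, next-point selection weighted by height drop and partial absorption at the current point, is given below; `neighbors`, `height`, `moisture` and the absorption rate are hypothetical stand-ins for the corresponding data of the embodiment.

```python
import random

def next_rain_point(current, neighbors, height):
    """Exclude higher neighbors, then sample the rest with probability
    proportional to the height drop (lower points are more likely)."""
    lower = [p for p in neighbors(current) if height[p] <= height[current]]
    if not lower:
        return None  # low-lying point reached, the rainfall stops here
    drops = [height[current] - height[p] + 1e-6 for p in lower]
    return random.choices(lower, weights=drops, k=1)[0]

def rain_step(current, rainfall, absorption_rate, moisture):
    """Part of the rainfall is absorbed by the soil at the current point
    (raising surface moisture); the remainder flows on to the next point."""
    absorbed = rainfall * absorption_rate
    moisture[current] = moisture.get(current, 0.0) + absorbed
    return rainfall - absorbed
```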
Rock weathering events:
A rock weathering event is mainly influenced by the day-night temperature difference at a place: when the temperature drops at night, the moisture in rock pores freezes and triggers the rock to break into small rocks. On the other hand, coverage by surface soil, sand and vegetation prevents the rock temperature difference from becoming too large and thus inhibits weathering. Therefore, the simulation of a rock weathering event considers both the day-night temperature difference of the place and the distribution of soil, sand and vegetation. The annual average day-night temperature difference of a place can be queried from the climate data, and the distribution densities of soil, sand and vegetation can be obtained from the first data information of each scene element; the probability of a rock weathering event occurring at the place can then be estimated: the greater the day-night temperature difference, the higher the probability, and the denser the surface soil, sand and vegetation, the lower the probability. When the probability value exceeds a preset threshold, a rock weathering event is triggered, at which point the density value of the rock blocks is increased.
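A minimal sketch of such an estimate is given below; the way the two factors are combined, the threshold values and the density increment are illustrative assumptions.

```python
def weathering_probability(day_night_diff, surface_cover_density,
                           diff_threshold=15.0, cover_threshold=0.6):
    """Larger day/night temperature difference raises the probability;
    denser soil/sand/vegetation cover suppresses it."""
    temperature_term = max(0.0, day_night_diff - diff_threshold) / diff_threshold
    exposure_term = max(0.0, cover_threshold - surface_cover_density) / cover_threshold
    return min(1.0, temperature_term * exposure_term)

def apply_weathering(rock_density_map, x, y, increment=0.05):
    """When the event is triggered, increase the rock-block density value at (x, y)."""
    rock_density_map[y][x] = min(1.0, rock_density_map[y][x] + increment)
```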
Landslide event:
Under the earth's gravity, rocks, sand and soil can produce landslide events; the corresponding landslide thresholds of rock, sand and soil are queried according to their physical attributes. When a landslide event is randomly triggered at a certain point on the terrain, the material slides downhill like rainwater until the inclination angle at a point is smaller than the landslide threshold, so that a large amount of rock, sand and soil accumulates at lower positions. Meanwhile, surface vegetation protects the soil: where vegetation is distributed, rock, sand and soil are less prone to loss. To simplify the simulation, the landslide threshold at a place on the surface is set to be positively correlated with the surface vegetation distribution density, which reduces the occurrence probability of landslide events. Conversely, when a landslide event occurs, the surface vegetation is damaged and the vegetation distribution density of the place is reduced.
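The sliding loop might be sketched as follows; `neighbors`, `height`, `inclination` and `vegetation` are hypothetical per-point lookups, and the exact way the vegetation density raises the threshold and is damaged along the path is an illustrative assumption.

```python
def landslide(start, neighbors, height, inclination, vegetation, base_threshold):
    """Material slides downhill, like rainwater, until the local inclination
    falls below the landslide threshold; denser vegetation raises the
    threshold, making sliding less likely, but is damaged where sliding occurs."""
    current = start
    path = [current]
    while True:
        threshold = base_threshold * (1.0 + vegetation.get(current, 0.0))
        if inclination[current] < threshold:
            break  # rock, sand and soil accumulate here
        lower = [p for p in neighbors(current) if height[p] < height[current]]
        if not lower:
            break
        current = min(lower, key=lambda p: height[p])  # slide toward the lowest neighbor
        vegetation[current] = max(0.0, vegetation.get(current, 0.0) - 0.1)
        path.append(current)
    return path
```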
Ecological events:
In an ecological event, the cyclic interaction between plants and soil is simulated. Plant growth is limited by conditions such as soil humidity, air temperature and illumination, and different types of plants have different preferences for these factors. According to the plant types preset in the hierarchical data structure (grass, shrub, broadleaf arbor, conifer arbor and the like), the growth conditions of each plant can be obtained by querying bioclimatic data; taking soil humidity as an example, a lowest limit humidity, a lowest ideal humidity, a highest ideal humidity and a highest limit humidity can be obtained. To simplify the simulation, piecewise linear functions are used to model the effect of each climate factor on plant growth. Because a plant can only grow when humidity, air temperature and illumination are all satisfied, the growth values corresponding to humidity, air temperature and illumination are obtained and the minimum of the three is taken as the final growth value of the plant; this value is used to judge whether the plant grows, dies or sprouts, and the density value of each plant layer in the hierarchical data structure is modified accordingly.
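A minimal sketch of the piecewise linear response and of taking the minimum over the three factors is shown below; the function names and the structure of the `ranges` argument are illustrative assumptions.

```python
def growth_response(value, low_limit, low_ideal, high_ideal, high_limit):
    """Piecewise linear response of plant growth to one climate factor
    (e.g. soil humidity): 0 outside the limit values, 1 inside the ideal band."""
    if value <= low_limit or value >= high_limit:
        return 0.0
    if value < low_ideal:
        return (value - low_limit) / (low_ideal - low_limit)
    if value <= high_ideal:
        return 1.0
    return (high_limit - value) / (high_limit - high_ideal)

def plant_growth(humidity, temperature, light, ranges):
    """The final growth value is the minimum of the three factor responses,
    since the plant only thrives when all conditions are satisfied."""
    return min(growth_response(humidity, *ranges["humidity"]),
               growth_response(temperature, *ranges["temperature"]),
               growth_response(light, *ranges["light"]))
```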
Lightning events:
According to recent geological literature, a lightning event not only destroys vegetation but also produces a large quantity of rock blocks: a single lightning strike can produce rock blocks weighing tons, and on the order of ten thousand cubic meters of rock blocks can be produced per square kilometer over one hundred years. A lightning event therefore influences the density values of plants and rock blocks. On the other hand, the occurrence of a lightning event is related to the terrain: the higher the altitude and the more prominent the terrain, the higher the probability of a lightning strike. If a lightning strike event is triggered, the vegetation layer density in the hierarchical data structure decreases, the dead-tree layer density increases, and the stone layer density also increases.
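The effect of a triggered strike on the hierarchical data structure might look as follows; the layer names and the increments are illustrative assumptions.

```python
def apply_lightning_strike(layers, x, y):
    """A lightning strike lowers the vegetation-layer density and raises the
    dead-tree and rock-block layer densities at the struck point."""
    layers["vegetation"][y][x] = max(0.0, layers["vegetation"][y][x] - 0.3)
    layers["dead_tree"][y][x] = min(1.0, layers["dead_tree"][y][x] + 0.2)
    layers["rock_block"][y][x] = min(1.0, layers["rock_block"][y][x] + 0.1)
```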
These events mainly simulate the influence of natural phenomena on the terrain. Although each single-point event only has a limited local influence, after simulation over many time periods the hierarchical data structure gradually converges to a stable, balanced state, so that the distribution of each resource conforms to phenomena observed in nature, such as large, sharp-edged rocks at mountain tops and slopes, many small cobblestones at river edges, fertile land and luxuriant plant growth in delta zones formed by river deposition, and many cold-resistant plants in high mountains. Meanwhile, the event set is extensible: earthquake events, fire events, shot events and the like can be customized to support the generation of richer natural scenes.
Optionally, the method further includes:
106. and generating the virtual scene according to the respective first data information of the scene elements obtained by simulation.
The first data information of each of the plurality of scene elements obtained by simulation is obtained when the virtual scene simulation is finished.
In order to enhance the controllability of the generation, second data information of the first scene element input by the user is introduced. The second data information of the first scene element can be drawn manually or obtained by analyzing space-based remote sensing data; applying it to the simulation result yields an output with stronger controllability. Taking a natural scene as an example, the formation of a natural scene is influenced by many factors: the result produced in the simulation process mainly considers the influence of natural phenomena on the terrain, while human activity is also an important factor influencing the natural scene. Therefore, the user can introduce the influence of human activity on the natural scene by inputting the second data information of the first scene element.
Therefore, in the above 106, "generating the virtual scene according to the first data information of each of the plurality of scene elements obtained by simulation" may specifically be implemented by the following steps:
and S31, acquiring second data information of the first scene element.
S32, according to the second data information of the first scene element, modifying the first data information of the first scene element obtained through simulation to obtain target data information of the first scene element.
And S33, combining the target data information of the first scene element to generate the virtual scene.
In S31, the first scene element refers to any one of the scene elements of the virtual scene. The second data information of the first scene element may be determined in one of the following ways:
A. and receiving second data information of the first scene element, which is edited by a user on a manual editing interface.
B. And generating second data information of the first scene element according to the remote sensing data of the real area corresponding to the virtual scene.
In the above mode A, the initialized second data information of the first scene element is provided on the manual editing interface; in response to the user's editing operation on the initialized second data information of the first scene element, the initialized second data information is modified to obtain the second data information of the first scene element. Taking the second data information as a second density map as an example, the gray values of all pixel points in the second density map are initialized to 1, and the initialized second density map of the first scene element is provided on the manual editing interface; in response to the user's selection of a first area on the initialized second density map of the first scene element, the gray value of each pixel point in the first area is set to 0 to obtain the second density map of the first scene element.
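A minimal sketch of this initialization and editing step is given below, using NumPy arrays as the density maps; the function names and the boolean `region_mask` argument are illustrative assumptions.

```python
import numpy as np

def init_second_density_map(height, width):
    """Initialize the gray value of every pixel in the second density map to 1."""
    return np.ones((height, width), dtype=np.float32)

def apply_user_selection(second_density_map, region_mask):
    """Set the gray values inside the user-selected first area to 0,
    e.g. where human activity removes the resource."""
    second_density_map[region_mask] = 0.0
    return second_density_map
```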
The specific implementation of the above mode B can be referred to in the prior art, and is not described herein again.
In S32, generally, taking a natural scene as an example, human activities may reduce the distribution of certain resources at certain locations. In one implementation, therefore, the first data information includes a first density map, the second data information includes a second density map, and the plurality of scene elements includes the first scene element. Correcting the simulated first density map of the first scene element according to the second density map of the first scene element to obtain the target density map of the first scene element may specifically be realized by the following step:
S321, if the gray value at a first pixel point in the simulated first density map of the first scene element is greater than the gray value at the corresponding pixel point in the second density map of the first scene element, replacing the gray value at the first pixel point in the first density map of the first scene element with the gray value at the corresponding pixel point in the second density map of the first scene element.
The first pixel point refers to any pixel point in a first density map of the first scene element obtained through simulation.
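Since the replacement only happens where the first value exceeds the second, step S321 amounts to a per-pixel minimum of the two density maps; a one-line sketch using NumPy is shown below (the function name is an illustrative assumption).

```python
import numpy as np

def correct_density_map(first_density_map, second_density_map):
    """S321 as a per-pixel minimum: wherever the simulated (first) density
    exceeds the user-supplied (second) density, the second value replaces it."""
    return np.minimum(first_density_map, second_density_map)
```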
Optionally, the method may further include:
106. and constructing a multilayer ordered data structure according to the hierarchical relationship of the scene elements.
Wherein each layer of data in the multi-layer ordered data structure is used to represent first data information of a corresponding scene element.
The hierarchical relationship of the plurality of scene elements may specifically be a bottom-to-top hierarchical relationship of the plurality of scene elements in the virtual scene. For example: the distribution of a plurality of scene elements from bottom to top is as follows: crust rock stratum, soil layer, rock block layer, sand soil layer, grass layer, shrub layer, broad leaf arbor layer, needle leaf arbor layer, withered tree layer, animal layer, etc.
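A minimal sketch of such a multilayer ordered data structure is given below; the layer names follow the example above, and representing each layer as a density-map array is an illustrative assumption.

```python
import numpy as np

# Layers ordered from bottom to top, as in the example above.
LAYER_ORDER = ["crust_rock", "soil", "rock_block", "sand", "grass", "shrub",
               "broadleaf_arbor", "conifer_arbor", "dead_tree", "animal"]

def build_hierarchical_structure(height, width):
    """Each layer holds the first data information (here a density map) of the
    corresponding scene element; insertion order keeps the layers ordered."""
    return {name: np.zeros((height, width), dtype=np.float32) for name in LAYER_ORDER}
```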
In the above 106, "generating the virtual scene according to the respective first data information of the plurality of scene elements obtained through simulation" specifically includes: and generating the virtual scene according to each layer of data in the multilayer ordered data structure obtained by simulation.
The number of the layers of the multilayer ordered data structure is extensible, and the multilayer ordered data structure can be specifically extended according to actual needs, so that the universality of the scheme is improved.
In addition, considering that the terrain height distribution of a natural scene also affects its natural evolution, when the virtual scene is a virtual natural scene and the first change stage is related to natural evolution, the method may further include: acquiring a terrain height map of the virtual scene. Accordingly, in the above 102, "executing the first simulation process related to the first change stage based on the first simulation strategy" specifically includes: executing the first simulation process related to the first change stage based on the first simulation strategy in combination with the terrain height map. In this embodiment, performing the simulation in combination with the terrain height map helps improve the realism of the simulation.
In an example, the virtual scene may be generated according to the respective first data information of the plurality of scene elements obtained through simulation and a terrain height map of the virtual scene. Specifically, a scene element model corresponding to each of the plurality of scene elements and a terrain texture corresponding to each of the plurality of scene elements may be obtained; and inputting a scene element model corresponding to each of the plurality of scene elements, a terrain material corresponding to each of the plurality of scene elements, first data information of each of the plurality of scene elements obtained through simulation, and the terrain height map into a rendering engine to generate the virtual scene.
The rendering engine may be UE4 (Unreal Engine 4). The generated virtual scene may specifically be a 3D model, and the scene element model corresponding to each scene element may also specifically be a 3D model, for example: 3D vegetation models, 3D rock block models, 3D soil models, and the like.
The change process of the virtual scene is simulated by adopting different simulation strategies at different change stages, so that the simulation pertinence is improved, the generation effect can be further improved, and the generated virtual scene is more natural and harmonious.
It should be added that, in the above-mentioned embodiment 103, the specific implementation process of "executing the second simulation process related to the second variation phase based on the second simulation strategy" may refer to the related content of "executing the first simulation process related to the first variation phase based on the first simulation strategy" in the above-mentioned embodiments, and will not be described in detail herein.
First application scenario example (as shown in fig. 1 a):
step 100, a user can input a terrain height map of a virtual natural scene to be generated through an import control 10 for importing the terrain height map provided on a user interface.
Step 200, executing a simulation algorithm to adjust respective first data information of a plurality of scene elements for constituting the virtual natural scene.
The specific implementation process of the simulation algorithm is as follows: acquiring a first simulation strategy and a second simulation strategy corresponding to the virtual natural scene, wherein the virtual natural scene corresponds to a first change stage corresponding to the first simulation strategy and a second change stage corresponding to the second simulation strategy; executing a first simulation process regarding the first variation phase based on the first simulation strategy to adjust first data information for each of a plurality of scene elements constituting a virtual natural scene; based on the second simulation strategy, a second simulation process is performed with respect to the second variation phase to adjust respective first data information of a plurality of scene elements constituting the virtual natural scene.
And 300, generating the virtual natural scene according to the respective first data information of the scene elements obtained when the simulation is finished.
And step 400, outputting and displaying the generated virtual natural scene on a user interface.
The specific implementation of the above steps can refer to the corresponding content in the above embodiments, and is not described herein again.
The second application scenario is for example (as shown in fig. 1 c):
the technical solution provided by the embodiment of the present application will be described below by way of example with reference to fig. 1 c:
step 500, a user may input a terrain height map of a virtual natural scene to be generated through an import control 10 provided on a user interface for importing the terrain height map.
Step 600, executing a simulation algorithm to adjust respective first data information of a plurality of scene elements for constituting the virtual natural scene.
The specific implementation process of the simulation algorithm is as follows: dividing a change process to be simulated of the virtual natural scene into n change stages; setting a corresponding simulation strategy for each change stage; the simulation strategies corresponding to different change stages are different; and executing the simulation process related to each change stage according to the time sequence of the n change stages and the simulation strategy corresponding to each change stage in sequence so as to adjust the respective first data information of the plurality of scene elements for forming the virtual natural scene.
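A minimal sketch of this multi-stage driver loop is given below; the `simulate_step` interface of a strategy object and the fixed number of steps per stage are illustrative assumptions.

```python
def run_simulation(scene_elements, stages, strategies, steps_per_stage=100):
    """Execute the n change stages in chronological order, each with its own
    simulation strategy; every strategy adjusts the scene elements' first
    data information in place."""
    for stage, strategy in zip(stages, strategies):
        for step in range(steps_per_stage):
            strategy.simulate_step(stage, step, scene_elements)
    return scene_elements
```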
Step 700, generating the virtual natural scene according to the adjusted first data information of the scene elements.
And step 800, outputting and displaying the generated virtual natural scene on a user interface.
The specific implementation of the above steps can refer to the corresponding content in the above embodiments, and is not described herein again.
In the prior art, although the scheme of manual drawing by professional programmers can meet the controllability requirement, it is only practical for small-scale natural scenes. For large-scale natural scenes, production efficiency is low, and manual production is monotonous and error-prone, so the requirements of large scale, diversity and realism are difficult to meet. The prior-art construction scheme based on multi-view aerial scanning by unmanned aerial vehicles has a high degree of automation and can produce large-scale natural scenes, but its precision is not high, the close-range display effect is poor, and it uses a single data source. The automatic 3D natural scene generation method based on multi-strategy, multi-event simulation makes reasonable use of multivariate data and deeply describes the formation principle of natural scenes; it can quickly and automatically generate large-scale natural scenes, is suitable for macroscopic display of a scene while also supporting close-up microscopic display, and meets the requirements of controllability, diversity and verisimilitude.
Fig. 2 shows a flowchart of a virtual scene generation method according to an embodiment of the present application. The execution main body of the method can be a client or a server. The client may be hardware integrated on the terminal and having an embedded program, may also be first software installed in the terminal, and may also be tool software embedded in an operating system of the terminal, which is not limited in this embodiment of the present application. The terminal can be any terminal equipment including a mobile phone, a tablet personal computer, an intelligent sound box and the like. The server may be a common server, a cloud, a virtual server, or the like, which is not specifically limited in this embodiment of the application. As shown in fig. 2, the method includes:
201. and executing a simulation process related to the change of the virtual scene aiming at the virtual scene to be generated to obtain first data information of a first scene element of the virtual scene.
202. Second data information of the first scene element is obtained.
203. And modifying the first data information of the first scene element according to the second data information of the first scene element to obtain target data information of the first scene element.
204. And generating the virtual scene by combining the target data information of the first scene element.
In the above 201, in an example, the same simulation strategy may be adopted for simulation in the to-be-simulated change process of the virtual scene. Namely, the same simulation strategy is adopted to execute the simulation process related to the change of the virtual scene. The specific implementation of executing the simulation process related to the virtual scene change by using the same simulation strategy may refer to the implementation process of the first simulation process in each of the above embodiments.
In another example, the virtual scene may be simulated using different simulation strategies in different variation phases. Specifically, in the above 201, "executing a simulation process related to a change of the virtual scene with respect to the virtual scene to be generated" may be implemented by the following steps:
2011. and acquiring a first simulation strategy and a second simulation strategy corresponding to the virtual scene.
The virtual scene corresponds to a first change phase corresponding to the first simulation strategy and a second change phase corresponding to the second simulation strategy.
2012. Executing a first simulation process with respect to the first variation phase based on the first simulation strategy.
2013. Executing a second simulation process with respect to the second variation phase based on the second simulation strategy.
The specific implementation of the steps 2011, 2012 and 2013 can refer to the corresponding content in the embodiments, and is not described herein again.
In 202, the second data information of each of the scene elements may be input by a user.
The second data information of the first scene element may be determined in one of the following ways:
A. and receiving second data information of the first scene element, which is edited by a user on a manual editing interface.
B. And generating second data information of the first scene element according to the remote sensing data of the real area corresponding to the virtual scene.
For specific implementation of the mode a and the mode B, reference may be made to corresponding contents in the above embodiments, and details are not described herein.
For specific implementation of the step 203 and the step 204, reference may be made to corresponding contents in the foregoing embodiments, and details are not described herein.
In the technical scheme provided by the embodiment of the application, in order to enhance the controllability of the generation, second data information of a plurality of scene elements input by a user is introduced and applied to a simulation result, so that the controllability of a finally generated virtual scene is improved.
Here, it should be noted that: the content of each step in the method provided by the embodiment of the present application, which is not described in detail in the foregoing embodiment, may refer to the corresponding content in the foregoing embodiment, and is not described herein again. In addition, the method provided in the embodiment of the present application may further include, in addition to the above steps, other parts or all of the steps in the above embodiments, and specific reference may be made to corresponding contents in the above embodiments, which is not described herein again.
Fig. 3 is a flowchart illustrating an interface interaction method according to an embodiment of the present application. The execution main body of the method can be a client or a server. The client may be hardware integrated on the terminal and having an embedded program, may also be first software installed in the terminal, and may also be tool software embedded in an operating system of the terminal, which is not limited in this embodiment of the present application. The terminal can be any terminal equipment including a mobile phone, a tablet personal computer, an intelligent sound box and the like. The server may be a common server, a cloud, a virtual server, or the like, which is not specifically limited in this embodiment of the application. As shown in fig. 3, the method includes:
301. and displaying the virtual scene obtained by simulation on a user interface.
And generating the virtual scene obtained by simulation by combining the first data information of the first scene element obtained by simulation. The first data information of the first scene element is obtained by adjustment according to a simulation process related to the virtual scene change.
302. And responding to second data information of the first scene element input by a user on the user interface, and correcting the simulated first data information of the first scene element according to the second data information of the first scene element to obtain target data information of the first scene element.
303. And displaying the corrected virtual scene on the user interface.
Wherein the modified virtual scene is generated in combination with target data information of the first scene element.
The method may further include:
304. and executing a simulation process related to the change of the virtual scene aiming at the virtual scene to be generated to obtain first data information of a first scene element of the virtual scene.
For the specific implementation of the above-mentioned step 304, reference may be made to the corresponding contents in the above-mentioned embodiments, and details are not described herein again.
In 302, if the user considers that the virtual scene displayed on the user interface and generated based on the first data information of each of the plurality of scene elements obtained through simulation is not satisfactory, the second data information of the first scene element may be input on the user interface for correction. An input control can be set on the user interface, and the second data information of the first scene element is imported in response to the triggering operation of the user for the input control.
The second data information of the first scene element may be determined in one of the following ways:
A. and receiving second data information of the first scene element, which is edited by a user on a manual editing interface.
B. And generating second data information of the first scene element according to the remote sensing data of the real area corresponding to the virtual scene.
For specific implementation of the mode a and the mode B, reference may be made to corresponding contents in the above embodiments, and details are not described herein.
In 302, for a specific implementation of "modifying the first data information of the first scene element obtained through simulation according to the second data information of the first scene element to obtain the target data information of the first scene element", reference may be made to corresponding contents in the foregoing embodiments, and details are not described here again.
In the above 303, the virtual scene obtained through simulation and the modified virtual scene may be displayed side by side on the user interface. Alternatively, the virtual scene obtained through simulation may be replaced in the user interface by the modified virtual scene for display.
In the technical scheme provided by the embodiment of the application, in order to enhance the controllability of the generation, second data information of a plurality of scene elements input by a user is introduced and applied to a simulation result, so that the controllability of a finally generated virtual scene is improved.
Here, it should be noted that: the content of each step in the method provided by the embodiment of the present application, which is not described in detail in the foregoing embodiment, may refer to the corresponding content in the foregoing embodiment, and is not described herein again. In addition, the method provided in the embodiment of the present application may further include, in addition to the above steps, other parts or all of the steps in the above embodiments, and specific reference may be made to corresponding contents in the above embodiments, which is not described herein again.
The embodiment of the application also provides a commodity display method. The method comprises the following steps:
120. a virtual scene for displaying the commodity is determined.
122. And acquiring a first simulation strategy and a second simulation strategy corresponding to the virtual scene.
The virtual scene corresponds to a first change phase corresponding to the first simulation strategy and a second change phase corresponding to the second simulation strategy.
124. Executing a first simulation process with respect to the first variation phase based on the first simulation strategy.
126. Executing a second simulation process with respect to the second variation phase based on the second simulation strategy.
128. And generating and displaying the virtual scene according to a simulation result in the first simulation process and the second simulation process.
In the above 120, the commodity may be a house, a tourist attraction, a fresh product, etc.
In the aforementioned step 122, in a specific application scenario, in order to help the user understand the lighting conditions of a house to be sold, the lighting conditions of the house over one day may be displayed on a house trading platform. Specifically, the change process to be simulated for the virtual house scene showing the house may be divided into a first change stage corresponding to the morning and a second change stage corresponding to the afternoon. A first simulation strategy suitable for the morning and a second simulation strategy suitable for the afternoon can be set in advance according to actual needs, so that targeted simulation is achieved and the simulation effect is improved.
In another specific application scenario, in order to facilitate users to know tourist attractions (such as yellow mountains), landscape changes of the tourist attractions all the year round can be displayed on the tourist e-commerce platform. Specifically, the change process to be simulated for showing the virtual scene of the tourist attraction may be divided into four change stages corresponding to four seasons, respectively, and the natural events corresponding to the four seasons are usually different. Corresponding simulation strategies can be set for the four change stages in advance according to actual needs, and the simulation strategies corresponding to different change stages are different (specifically, event sets in the simulation strategies corresponding to different change stages are different), so that targeted simulation is realized, and the simulation effect is improved. The four variation phases include the first variation phase and the second variation phase.
In another specific application scenario, the deterioration process of the fresh food can be displayed to the user on the fresh food e-commerce platform. Since the deterioration process of the fresh food is divided into a plurality of change stages, the types of microorganisms, chemical reactions and the like involved in each stage are different, and in order to realize targeted simulation and improve the simulation effect, corresponding simulation strategies can be set for the plurality of change stages according to actual needs, and the simulation strategies corresponding to different change stages are different (specifically, the types of microorganisms and/or chemical reactions in the simulation strategies corresponding to different change stages are different).
In the above 128, in the first simulation process and the second simulation process, the virtual scene is generated and displayed according to the simulation result, so as to display the dynamic change process of the commodity to the user.
In this embodiment, the change process of the virtual scene for displaying the commodity is simulated by adopting different simulation strategies at different change stages, so that the simulation pertinence is improved, the generation effect can be further improved, and the generated virtual scene is more natural and harmonious.
Taking a tourist attraction as an example, different tourist attractions may be suitable for tourism in different periods of the year. For example, the Yellow Mountain is suitable for tourism from March to May each year, while the Fragrant Mountain is suitable for tourism from October to November each year. In order to highlight the characteristics of a tourist attraction, the scenery of the period suitable for tourism can be highlighted when the virtual scene is displayed. Specifically, the "displaying the virtual scene" in 128 includes: highlighting the virtual scene generated based on simulation results relating to a specified change stage. The specified change stage may be preset; taking the commodity as a tourist attraction as an example, the specified change stage may correspond to the period during which the tourist attraction is suitable for tourism. The "highlighting" can be realized by means such as enlarged display or highlighted display.
In addition, in actual application, different rendering modes can be selected for different users to render and display the virtual scene, so that the user experience is improved. In an example, the different rendering modes may include, but are not limited to: 3D rendering and 2D rendering. Specifically, the "displaying the virtual scene" in 128 includes: acquiring a user type requesting to display the virtual scene; determining a rendering mode according to the user type; and rendering and displaying the virtual scene according to the rendering mode. The user types may specifically include, but are not limited to: mobile traffic user type (2G, 3G), wifi user type. Because 3D rendering needs to consume a large amount of network traffic resources, a 2D rendering mode can be adopted for the type of mobile traffic users; for wifi user types, a 3D rendering mode may be employed.
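As a simple illustration of this selection logic (the user-type labels and the mapping are assumptions made for the sketch):

```python
def choose_rendering_mode(user_type):
    """3D rendering consumes much more network traffic, so mobile-traffic
    users (2G/3G) fall back to 2D rendering, while wifi users get 3D."""
    return "2D" if user_type in ("2G", "3G") else "3D"
```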
Fig. 4 shows a block diagram of a virtual scene generation apparatus according to an embodiment of the present application. As shown in fig. 4, the apparatus includes:
a first obtaining module 401, configured to obtain a first simulation policy and a second simulation policy corresponding to the virtual scene, where the virtual scene corresponds to a first change phase corresponding to the first simulation policy and a second change phase corresponding to the second simulation policy;
a first execution module 402 for executing a first simulation process relating to the first variation phase based on the first simulation strategy; and further configured to execute a second simulation procedure with respect to the second variation phase based on the second simulation strategy.
Optionally, the apparatus may further include:
the second acquisition module is used for acquiring the first information; the first information relates to simulation convergence rates corresponding to respective scene elements of the virtual scene; the larger the simulation convergence speed is, the faster the first data information of the corresponding scene element converges to a stable state in the third simulation process; the same simulation strategy is adopted in the third simulation process;
a first determining module, configured to determine the first simulation policy and the second simulation policy, where the first simulation policy and the second simulation policy are set in combination with the first information.
Optionally, the first simulation policy further includes a selected probability corresponding to each event in the first event set; the above apparatus may further include:
and a second execution module, configured to execute an unequal probability sampling according to the first time point and by combining with the selected probability corresponding to each event in the first event set, so as to extract an event from the first event set as the selected event.
Optionally, the apparatus may further include:
and the first generation module is used for generating the virtual scene according to the respective first data information of the plurality of scene elements obtained by simulation.
Optionally, the apparatus may further include:
the first construction module is used for constructing a multilayer ordered data structure according to the hierarchical relationship of the scene elements; wherein each layer of data in the multi-layer ordered data structure is used for representing first data information of a corresponding scene element;
correspondingly, the first generating module is specifically configured to: and generating the virtual scene according to each layer of data in the multilayer ordered data structure obtained by simulation.
Optionally, the apparatus may further include:
the third acquisition module is used for acquiring a terrain height map of the virtual scene;
correspondingly, the first execution module is specifically configured to:
and executing a first simulation process related to the first change stage based on the first simulation strategy in combination with the terrain height map.
Optionally, the apparatus may further include:
and the initialization module is used for initializing the respective first data information of the scene elements to be used as a simulation basis.
Here, it should be noted that: the virtual scene generation apparatus provided in the foregoing embodiments may implement the technical solutions described in the foregoing method embodiments, and the specific implementation principle of each module may refer to the corresponding content in the foregoing method embodiments, which is not described herein again.
Fig. 5 shows a block diagram of a virtual scene generation apparatus according to an embodiment of the present application. As shown in fig. 5, the apparatus includes:
a third executing module 501, configured to execute, for a virtual scene to be generated, a simulation process related to a change of the virtual scene to obtain first data information of a first scene element of the virtual scene;
a fourth obtaining module 502, configured to obtain second data information of the first scene element;
a first modification module 503, configured to modify first data information of the first scene element according to second data information of the first scene element, to obtain target data information of the first scene element;
a second generating module 504, configured to generate the virtual scene in combination with the target data information of the first scene element.
Optionally, the third executing module 501 is specifically configured to:
acquiring a first simulation strategy and a second simulation strategy corresponding to the virtual scene, wherein the virtual scene corresponds to a first change stage corresponding to the first simulation strategy and a second change stage corresponding to the second simulation strategy;
executing a first simulation process with respect to the first variation phase based on the first simulation strategy;
executing a second simulation process with respect to the second variation phase based on the second simulation strategy.
Here, it should be noted that: the virtual scene generation apparatus provided in the foregoing embodiments may implement the technical solutions described in the foregoing method embodiments, and the specific implementation principle of each module may refer to the corresponding content in the foregoing method embodiments, which is not described herein again.
Fig. 6 shows a block diagram of an interface interaction apparatus according to an embodiment of the present application. As shown in fig. 6, the apparatus includes:
a first display module 601, configured to display a virtual scene obtained through simulation on a user interface; the virtual scene obtained through simulation is generated by combining first data information of the first scene element obtained through simulation;
a second modification module 602, configured to, in response to second data information of the first scene element input by a user on the user interface, modify, according to the second data information of the first scene element, first data information of the first scene element obtained through simulation, so as to obtain target data information of the first scene element;
the first display module 601 is further configured to display the modified virtual scene on the user interface; the modified virtual scene is generated in combination with the target data information of the first scene element.
Here, it should be noted that: the interface interaction device provided in the above embodiments may implement the technical solutions described in the above method embodiments, and the specific implementation principle of each module may refer to the corresponding content in the above method embodiments, which is not described herein again.
Fig. 7 shows a schematic structural diagram of an electronic device according to an embodiment of the present application. As shown, the electronic device includes a memory 1101 and a processor 1102. The memory 1101 may be configured to store other various data to support operations on the electronic device. Examples of such data include instructions for any application or method operating on the electronic device. The memory 1101 may be implemented by any type or combination of volatile or non-volatile memory devices such as Static Random Access Memory (SRAM), electrically erasable programmable read-only memory (EEPROM), erasable programmable read-only memory (EPROM), programmable read-only memory (PROM), read-only memory (ROM), magnetic memory, flash memory, magnetic or optical disks.
The memory is used for storing programs;
the processor 1102 is coupled to the memory 1101, and configured to execute the program stored in the memory 1101, so as to implement the virtual scene generation, the interface interaction, or the merchandise display method in the foregoing embodiments.
Further, as shown in fig. 7, the electronic device further includes: communication components 1103, display 1104, power components 1105, audio components 1106, and the like. Only some of the components are schematically shown in fig. 7, and the electronic device is not meant to include only the components shown in fig. 7.
Accordingly, embodiments of the present application further provide a computer-readable storage medium storing a computer program, where the computer program, when executed by a computer, can implement the steps or functions of the virtual scene generation, the interface interaction, and the merchandise display method provided in the foregoing embodiments.
The above-described embodiments of the apparatus are merely illustrative, and the units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the modules may be selected according to actual needs to achieve the purpose of the solution of the present embodiment. One of ordinary skill in the art can understand and implement it without inventive effort.
Through the above description of the embodiments, those skilled in the art will clearly understand that each embodiment can be implemented by software plus a necessary general hardware platform, and certainly can also be implemented by hardware. With this understanding in mind, the above-described technical solutions may be embodied in the form of a software product, which can be stored in a computer-readable storage medium such as ROM/RAM, magnetic disk, optical disk, etc., and includes instructions for causing a computer device (which may be a personal computer, a server, or a network device, etc.) to execute the methods described in the embodiments or some parts of the embodiments.
Finally, it should be noted that: the above embodiments are only used to illustrate the technical solutions of the present application, and not to limit the same; although the present application has been described in detail with reference to the foregoing embodiments, it should be understood by those of ordinary skill in the art that: the technical solutions described in the foregoing embodiments may still be modified, or some technical features may be equivalently replaced; and such modifications or substitutions do not depart from the spirit and scope of the corresponding technical solutions in the embodiments of the present application.

Claims (28)

1. A virtual scene generation method is characterized by comprising the following steps:
acquiring a first simulation strategy and a second simulation strategy corresponding to the virtual scene, wherein the virtual scene corresponds to a first change stage corresponding to the first simulation strategy and a second change stage corresponding to the second simulation strategy;
executing a first simulation process with respect to the first variation phase based on the first simulation strategy;
executing a second simulation process with respect to the second variation phase based on the second simulation strategy.
2. The method of claim 1, further comprising:
acquiring first information; the first information relates to simulation convergence rates corresponding to respective scene elements of the virtual scene; the larger the simulation convergence speed is, the faster the first data information of the corresponding scene element converges to a stable state in the third simulation process; the same simulation strategy is adopted in the third simulation process;
determining the first simulation strategy and the second simulation strategy, wherein the first simulation strategy and the second simulation strategy are set in combination with the first information.
3. The method of claim 2, wherein determining the first simulation strategy and the second simulation strategy comprises:
and determining a first event set causing the virtual scene change in the first simulation strategy and a second event set causing the virtual scene change in the second simulation strategy by combining the first information.
4. The method of claim 3, wherein determining, in conjunction with the first information, a first set of events in the first simulation strategy that caused the virtual scene change and a second set of events in the second simulation strategy that caused the virtual scene change comprises:
dividing the plurality of scene elements into a plurality of convergence levels in conjunction with the first information; wherein, the higher the convergence level is, the higher the simulation convergence speed of the corresponding scene element is;
dividing a change process to be simulated related to the virtual scene into a plurality of change stages; the number of the plurality of convergence levels is the same as the number of the plurality of variation phases; the number of the plurality of variation phases is n; the plurality of variation phases includes the first variation phase and the second variation phase;
determining an event set in a simulation strategy corresponding to an ith change stage according to at least one event which affects the convergence of scene elements at the ith convergence level and below convergence levels in a plurality of preset events;
wherein i is an integer and has a value range of [1, n ]; the ith convergence level is higher than the (i + 1) th convergence level, and the ith variation phase precedes the (i + 1) th variation phase.
5. The method of any of claims 2 to 4, wherein performing a first simulation procedure with respect to the first variation phase based on the first simulation strategy comprises:
acquiring a first event set in the first simulation strategy;
executing a simulation process of generating a selected event at a selected place aiming at a first time point in the first change phase;
adjusting respective first data information of the plurality of scene elements according to a simulation result of the selected event occurring at the selected place;
wherein the selected place is selected from the virtual scene for the first point in time; the selected event is selected from the first set of events for the first point in time.
6. The method of claim 5, wherein the first simulation strategy further includes a selected probability corresponding to each event in the first set of events;
the method further comprises the following steps:
and aiming at the first time point, combining the selected probability corresponding to each event in the first event set, and executing unequal probability sampling to extract one event from the first event set as the selected event.
7. The method according to claim 5, wherein for a first point in time within the first variation phase, performing a simulation process of occurrence of a selected event at a selected location, comprises:
acquiring state information of the selected place at the first time point; wherein the state information is related to the selected event;
determining the occurrence probability of the selected event at the selected place according to the state information and the simulation parameters corresponding to the selected event;
and when the occurrence probability is greater than or equal to a first preset threshold value, executing a simulation process of the selected event at the selected place aiming at the first time point.
8. The method according to claim 5, wherein the first simulation strategy further includes simulation parameters corresponding to each event in the first event set;
for a first time point within the first variation phase, executing a simulation process of occurrence of a selected event at a selected location, including:
and aiming at the first time point, combining the simulation parameters corresponding to the selected event in the first simulation strategy, and executing the simulation process of the selected event at the selected place.
9. The method of any of claims 2 to 4, further comprising:
and generating the virtual scene according to the respective first data information of the scene elements obtained by simulation.
10. The method of claim 9, wherein the plurality of scene elements includes a first scene element;
generating the virtual scene according to the respective first data information of the plurality of scene elements obtained by simulation, including:
acquiring second data information of the first scene element;
according to the second data information of the first scene element, modifying the first data information of the first scene element obtained by simulation to obtain target data information of the first scene element;
and generating the virtual scene by combining the target data information of the first scene element.
11. The method of claim 10, wherein the second data information of the first scene element is determined in one of the following ways:
receiving second data information of the first scene element, which is edited by a user on a manual editing interface;
and generating second data information of the first scene element according to the remote sensing data of the real area corresponding to the virtual scene.
12. The method of claim 9, further comprising:
constructing a multilayer ordered data structure according to the hierarchical relationship of the scene elements; wherein each layer of data in the multi-layer ordered data structure is used for representing first data information of a corresponding scene element;
generating the virtual scene according to the respective first data information of the plurality of scene elements obtained by simulation, including:
and generating the virtual scene according to each layer of data in the multilayer ordered data structure obtained by simulation.
13. The method of claim 9, wherein the virtual scene is a virtual natural scene; the first phase of variation is related to natural evolution;
the method further comprises the following steps:
acquiring a terrain height map of the virtual scene;
executing a first simulation process with respect to the first variation phase based on the first simulation strategy, including:
and executing a first simulation process related to the first change stage based on the first simulation strategy in combination with the terrain height map.
14. The method according to claim 13, wherein generating the virtual scene according to the simulated first data information of each of the scene elements comprises:
and generating the virtual scene according to the respective first data information of the plurality of scene elements obtained by simulation and the terrain height map.
15. The method according to claim 14, wherein generating the virtual scene according to the respective first data information of the plurality of scene elements obtained by simulation and the terrain height map comprises:
obtaining a scene element model corresponding to each of the plurality of scene elements and a terrain material corresponding to each of the plurality of scene elements;
and inputting the scene element model corresponding to each of the plurality of scene elements, the terrain material corresponding to each of the plurality of scene elements, the first data information of each of the plurality of scene elements obtained by simulation, and the terrain height map into a rendering engine to generate the virtual scene.
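Illustrative sketch (not part of the claims; "render" stands in for whatever rendering engine is actually used and is purely a placeholder): claim 15 lists four kinds of inputs, which can be bundled and handed to the engine in one step.

def generate_scene(element_models, terrain_materials, first_data, height_map, render):
    """Bundle per-element models, per-element terrain materials, the simulated
    first data information and the terrain height map, then invoke the engine."""
    scene_inputs = {
        "models": element_models,        # e.g. {"trees": "oak.fbx"}
        "materials": terrain_materials,  # e.g. {"trees": "forest_floor.mat"}
        "densities": first_data,         # simulated density maps per scene element
        "height_map": height_map,        # terrain height map of the virtual scene
    }
    return render(scene_inputs)

result = generate_scene(
    {"trees": "oak.fbx"}, {"trees": "forest_floor.mat"},
    {"trees": [[0.3, 0.7]]}, [[10.0, 12.5]],
    render=lambda inputs: f"scene with {len(inputs['models'])} element type(s)",
)
print(result)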
16. The method of claim 13, wherein the natural evolution comprises a terrain evolution and/or an ecological evolution.
17. The method of any of claims 2 to 4, further comprising:
initializing respective first data information of the plurality of scene elements as a simulation basis.
18. The method of any of claims 2 to 4, wherein the first data information comprises a first density map.
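Illustrative sketch (not part of the claims; the grid size and initial value are assumptions): the first density map of claim 18 can be represented as a 2D grid whose cells give the density of a scene element, and the initialization of claim 17 can simply fill that grid with a uniform starting value before simulation.

def init_density_map(rows, cols, value=0.0):
    """Create a density map initialised to a uniform value as a simulation basis."""
    return [[value for _ in range(cols)] for _ in range(rows)]

tree_density = init_density_map(4, 4, value=0.1)   # initial first data information for a "trees" element
print(tree_density[0])                             # [0.1, 0.1, 0.1, 0.1]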
19. A virtual scene generation method, comprising:
for a virtual scene to be generated, executing a simulation process related to a change of the virtual scene to obtain first data information of a first scene element of the virtual scene;
acquiring second data information of the first scene element;
according to the second data information of the first scene element, modifying the first data information of the first scene element to obtain target data information of the first scene element;
and generating the virtual scene by combining the target data information of the first scene element.
20. The method of claim 19, wherein the second data information of the first scene element is determined in one of the following ways:
receiving second data information of the first scene element edited by a user on a manual editing interface; or
generating second data information of the first scene element according to remote sensing data of the real area corresponding to the virtual scene.
21. The method of claim 19, wherein executing the simulation process related to the change of the virtual scene for the virtual scene to be generated comprises:
acquiring a first simulation strategy and a second simulation strategy corresponding to the virtual scene, wherein the virtual scene has a first variation phase corresponding to the first simulation strategy and a second variation phase corresponding to the second simulation strategy;
executing a first simulation process with respect to the first variation phase based on the first simulation strategy;
executing a second simulation process with respect to the second variation phase based on the second simulation strategy.
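Illustrative sketch (not part of the claims; the growth-factor model is invented purely to show the data flow): in claim 21 each variation phase has its own simulation strategy, and the second simulation process continues from the result of the first.

def simulate_phase(state, strategy):
    """Apply one variation phase: scale every density value by the strategy's
    growth factor raised to the configured number of steps."""
    factor = strategy["growth"] ** strategy["steps"]
    return {element: [[v * factor for v in row] for row in grid]
            for element, grid in state.items()}

initial_state = {"grass": [[0.1, 0.2]]}            # initialised first data information
first_strategy = {"growth": 1.5, "steps": 3}       # first simulation strategy (first variation phase)
second_strategy = {"growth": 0.9, "steps": 2}      # second simulation strategy (second variation phase)

after_first = simulate_phase(initial_state, first_strategy)
after_second = simulate_phase(after_first, second_strategy)
print(after_second)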
22. An interface interaction method, comprising:
displaying a virtual scene obtained through simulation on a user interface, wherein the virtual scene obtained through simulation is generated by combining first data information of a first scene element obtained through simulation;
in response to second data information of the first scene element input by a user on the user interface, correcting the first data information of the first scene element obtained through simulation according to the second data information of the first scene element to obtain target data information of the first scene element;
and displaying the corrected virtual scene on the user interface, wherein the corrected virtual scene is generated by combining the target data information of the first scene element.
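Illustrative sketch (not part of the claims; the user interface is reduced to print calls and the data layout is assumed): the interaction of claim 22 can be pictured as displaying the simulated scene, accepting a user edit as second data information, correcting the first data information, and displaying the corrected scene.

def interaction_loop(first_data, user_edits):
    print("displaying simulated scene:", first_data)
    target = dict(first_data)
    target.update(user_edits)                      # correction yields the target data information
    print("displaying corrected scene:", target)
    return target

interaction_loop({"tree_density": 0.4}, {"tree_density": 0.8})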
23. An electronic device, comprising: a memory and a processor, wherein,
the memory is used for storing programs;
the processor, coupled to the memory, is configured to execute the program stored in the memory to:
acquiring a first simulation strategy and a second simulation strategy corresponding to a virtual scene, wherein the virtual scene has a first variation phase corresponding to the first simulation strategy and a second variation phase corresponding to the second simulation strategy;
executing a first simulation process with respect to the first variation phase based on the first simulation strategy;
executing a second simulation process with respect to the second variation phase based on the second simulation strategy.
24. An electronic device, comprising: a memory and a processor, wherein,
the memory is used for storing programs;
the processor, coupled to the memory, is configured to execute the program stored in the memory to:
for a virtual scene to be generated, executing a simulation process related to a change of the virtual scene to obtain first data information of a first scene element of the virtual scene;
acquiring second data information of the first scene element;
according to the second data information of the first scene element, modifying the first data information of the first scene element to obtain target data information of the first scene element;
and generating the virtual scene by combining the target data information of the first scene element.
25. An electronic device, comprising: a memory and a processor, wherein,
the memory is used for storing programs;
the processor, coupled to the memory, is configured to execute the program stored in the memory to:
displaying a virtual scene obtained through simulation on a user interface, wherein the virtual scene obtained through simulation is generated by combining first data information of a first scene element obtained through simulation;
in response to second data information of the first scene element input by a user on the user interface, correcting the first data information of the first scene element obtained through simulation according to the second data information of the first scene element to obtain target data information of the first scene element;
and displaying the corrected virtual scene on the user interface, wherein the corrected virtual scene is generated by combining the target data information of the first scene element.
26. A commodity display method, comprising:
determining a virtual scene for displaying a commodity;
acquiring a first simulation strategy and a second simulation strategy corresponding to the virtual scene, wherein the virtual scene has a first variation phase corresponding to the first simulation strategy and a second variation phase corresponding to the second simulation strategy;
executing a first simulation process with respect to the first variation phase based on the first simulation strategy;
executing a second simulation process with respect to the second variation phase based on the second simulation strategy;
and generating and displaying the virtual scene according to simulation results of the first simulation process and the second simulation process.
27. The method of claim 26, wherein presenting the virtual scene comprises:
highlighting the virtual scene generated based on a simulation result related to a specified variation phase.
28. The method of claim 26, wherein presenting the virtual scene comprises:
acquiring a user type of a user requesting to display the virtual scene;
determining a rendering mode according to the user type;
and rendering and displaying the virtual scene according to the rendering mode.
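Illustrative sketch (not part of the claims; the user types and rendering modes are invented for exposition): claims 26 to 28 pair a two-phase scene simulation with a display step whose rendering mode is determined by the type of user requesting the display.

RENDER_MODES = {
    "mobile_shopper": "low_detail",      # lighter rendering for phone browsers
    "desktop_shopper": "high_detail",    # full-quality rendering
    "merchant_preview": "wireframe",     # fast preview while configuring the commodity display
}

def choose_render_mode(user_type):
    """Determine the rendering mode according to the user type, with a cheap fallback."""
    return RENDER_MODES.get(user_type, "low_detail")

print(choose_render_mode("desktop_shopper"))       # high_detail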
CN202010486656.4A 2020-06-01 2020-06-01 Virtual scene generation method, interface interaction method, commodity display method and equipment Active CN113516744B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010486656.4A CN113516744B (en) 2020-06-01 2020-06-01 Virtual scene generation method, interface interaction method, commodity display method and equipment


Publications (2)

Publication Number Publication Date
CN113516744A true CN113516744A (en) 2021-10-19
CN113516744B CN113516744B (en) 2022-09-27

Family

ID=78060992

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010486656.4A Active CN113516744B (en) 2020-06-01 2020-06-01 Virtual scene generation method, interface interaction method, commodity display method and equipment

Country Status (1)

Country Link
CN (1) CN113516744B (en)


Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101908232A (en) * 2010-07-30 2010-12-08 重庆埃默科技有限责任公司 Interactive scene simulation system and scene virtual simulation method
US20190067939A1 (en) * 2016-05-18 2019-02-28 China Electric Power Research Institute Company Limited Multi-time-scale digital/analog hybrid simulation system and method for power distribution network and storage medium
CN107291222A (en) * 2017-05-16 2017-10-24 阿里巴巴集团控股有限公司 Interaction processing method, device, system and the virtual reality device of virtual reality device
CN109785352A (en) * 2018-12-21 2019-05-21 广东工业大学 A kind of intelligent and high-efficiency airborne radar point cloud analysis method

Non-Patent Citations (5)

* Cited by examiner, † Cited by third party
Title
ALEJANDRO LUJAN et al.: "Generation of rule-based adaptive strategies for a collaborative virtual simulation environment", 2008 IEEE International Workshop on Haptic Audio Visual Environments and Games *
ZHANG XIAOWEI et al.: "Infrared multi-spectral scene image synthesis method", 《系统仿真技术》 *
LIN DANDAN et al.: "Infrared scene generation technology and its applications", 《红外技术》 *
FAN XIAOLEI et al.: "Application of multi-scale active grids in cloud scene simulation", 《中国图象图形学报》 *
JIN QIANG et al.: "Scene simulation of distribution network engineering based on multi-source terrain data fusion", 《供用电》 *

Also Published As

Publication number Publication date
CN113516744B (en) 2022-09-27

Similar Documents

Publication Publication Date Title
Paschalis et al. Urban forests as main regulator of the evaporative cooling effect in cities
Normand et al. A greener Greenland? Climatic potential and long-term constraints on future expansions of trees and shrubs
US7269539B2 (en) Dynamic weather simulation
US7349830B2 (en) Weather profiles
JP2005266791A (en) Development tool for defining attributes within multi-dimensional space
Perry et al. A GIS‐supported model for the simulation of the spatial structure of wildland fire, Cass Basin, New Zealand
CN105184017A (en) OpenSceneGraph-based real-time battlefield simulation system and method
Werner et al. Effect of changing vegetation and precipitation on denudation–Part 1: Predicted vegetation composition and cover over the last 21 thousand years along the Coastal Cordillera of Chile
CN111783360A (en) High-resolution land utilization and forest landscape process coupling simulation system and method
Chacón-Moreno et al. Impacts of global change on the spatial dynamics of treeline in Venezuelan Andes
Fomin et al. Genetic forest typology as a scientific and methodological basis for environmental studies and forest management
Chung et al. Using urban effect corrected temperature data and a tree phenology model to project geographical shift of cherry flowering date in South Korea
Van Soesbergen Impacts of climate change on water resources of global dams
CN113516744B (en) Virtual scene generation method, interface interaction method, commodity display method and equipment
Bhelawe et al. Rainfall variability in Chhattisgarh State using GIS
de Wergifosse Simulating tree growth response to climate change in structurally-complex oak and beech stands across Europe
Nemeth et al. Multi-variable verification of hydrological processes in the upper North Saskatchewan River basin, Alberta, Canada
De Groote et al. ORCHIDEE-SRC v1. 0: an extension of the land surface model ORCHIDEE for simulating short rotation coppice poplar plantations
Priya et al. Modeling spatial crop production: A GIS approach
Verbruggen et al. Mapping Sahelian ecosystem vulnerability to vegetation collapse: Vegetation model optimization
Favorskaya et al. Modelling of forest ecosystems
Robinson Effects of land-use policy, forest fragmentation, and residential parcel size on land-cover and carbon storage in Southeastern Michigan
Bequet Environmental determinants of the temporal and spatial variability in leaf area index of Fagus sylvatica L., Quercus robur L. and Pinus sylvestris L.
Qi Human-induced biospheric change and the global carbon cycle: a spatial modeling approach and its application to tropical Asia
Lapides et al. Inclusion of bedrock vadose zone in dynamic global vegetation models is key for simulating vegetation structure and function

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant