CN114367113A - Method, apparatus, medium, and computer program product for editing virtual scene


Info

Publication number
CN114367113A
CN114367113A
Authority
CN
China
Prior art keywords
scene
scene element
virtual
editing
edited
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202210036337.2A
Other languages
Chinese (zh)
Inventor
周衍
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shanghai Lilith Technology Corp
Original Assignee
Shanghai Lilith Technology Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shanghai Lilith Technology Corp filed Critical Shanghai Lilith Technology Corp
Priority to CN202210036337.2A priority Critical patent/CN114367113A/en
Publication of CN114367113A publication Critical patent/CN114367113A/en
Pending legal-status Critical Current

Classifications

    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F 13/00 Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F 13/60 Generating or modifying game content before or while executing the game program, e.g. authoring tools specially adapted for game development or game-integrated level editor
    • A63F 13/63 Generating or modifying game content before or while executing the game program by the player, e.g. authoring using a level editor
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F 13/00 Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F 13/50 Controlling the output signals based on the game progress
    • A63F 13/52 Controlling the output signals based on the game progress involving aspects of the displayed game scene
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F 2300/00 Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F 2300/60 Methods for processing data by generating or executing the game program
    • A63F 2300/66 Methods for processing data by generating or executing the game program for rendering three dimensional images
    • A63F 2300/663 Methods for processing data by generating or executing the game program for rendering three dimensional images for simulating liquid objects, e.g. water, gas, fog, snow, clouds

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Processing Or Creating Images (AREA)

Abstract

The present invention relates to the field of three-dimensional modeling, and more particularly, to a method, apparatus, medium, and computer program product for editing a virtual scene. The method of the invention comprises the following steps: determining a first scene element and a second scene element in a virtual scene to be edited, wherein the first scene element and the second scene element have different phase attributes; setting structural attributes of the first scene element and the second scene element respectively, and determining the positions of the first scene element and the second scene element in the virtual scene to be edited and/or the relative positions between the first scene element and the second scene element, wherein the first scene element and the second scene element at least partially overlap in the space of the virtual scene to be edited. The invention treats scene elements with different phase attributes as different objects, allows scene elements with different phase attributes to be edited independently, and renders and displays these scene elements in superposition.

Description

Method, apparatus, medium, and computer program product for editing virtual scene
Technical Field
The present invention relates to the field of three-dimensional modeling, and more particularly, to a method, apparatus, medium, and computer program product for editing a virtual scene.
Background
An existing volumetric terrain (Volumetric Terrain) editor places liquid and solid materials in the same three-dimensional object for editing, so any voxel (cell) in space can hold only one material. This causes the following technical problems.
First, solid material is easily destroyed when editing liquid material. As described above, since any voxel in space can hold only one material, solid material in surrounding voxels may be inadvertently replaced with liquid material while the liquid material is being edited.
Second, where the liquid material and the solid material meet, an unreasonable shape (e.g., an uneven liquid surface) is likely to form. Editors typically use algorithms such as Marching Box to automatically compute a smooth surface shape for a given voxel from the material fill value of that voxel, fitting it to the surfaces of surrounding voxels. A voxel's surface shape is therefore affected by its neighbors, so a solid voxel can distort the surface shape of an adjacent liquid voxel, whereas in reality a liquid surface should always be flat.
Third, during gameplay, if a hole is punched in the terrain beneath the liquid material, the hole is not automatically filled by the liquid.
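The single-material constraint behind all three problems can be illustrated with a toy model. The following sketch is hypothetical; the grid layout and function names are illustrative assumptions, not taken from any actual editor:

```python
# Hypothetical minimal model of the prior-art editor: every voxel stores
# exactly one material, so brushing liquid over a region silently
# overwrites any solid already stored there.
SOLID, LIQUID = "solid", "liquid"

def paint(grid, cells, material):
    """Prior-art behaviour: unconditionally overwrite each voxel."""
    for cell in cells:
        grid[cell] = material

grid = {(x, 0, 0): SOLID for x in range(4)}    # a small solid floor
paint(grid, [(1, 0, 0), (2, 0, 0)], LIQUID)    # brush a liquid region

# Two floor voxels were replaced -- the solid shape is now broken.
assert grid[(1, 0, 0)] == LIQUID and grid[(0, 0, 0)] == SOLID
```

Because the overwrite is the only way to store liquid, there is no record left of the solid that used to occupy those voxels.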
Disclosure of Invention
An object of the present invention is to provide a method, an apparatus, a medium, and a computer program product for editing a virtual scene that treat scene elements with different phase attributes as different objects, allow scene elements with different phase attributes to be edited independently, and render and display these scene elements in superposition.
The invention discloses a method for editing a virtual scene, which is used for electronic equipment and comprises the following steps:
determining a first scene element and a second scene element in a virtual scene to be edited, wherein the first scene element and the second scene element have different phase attributes;
setting structural attributes of the first scene element and the second scene element respectively, and determining the positions of the first scene element and the second scene element in the virtual scene to be edited and/or the relative positions between the first scene element and the second scene element, wherein the first scene element and the second scene element at least partially overlap in the space of the virtual scene to be edited.
Optionally, the method further comprises:
rendering and displaying the first scene element and the second scene element in superposition.
Optionally, the rendering and displaying the first scene element and the second scene element in superposition comprises:
superimposing the first scene element and the second scene element, rendering the superimposed first scene element and second scene element, and displaying the rendered first scene element and second scene element.
Optionally, the rendering and displaying the first scene element and the second scene element in superposition comprises:
rendering the first scene element and the second scene element, superimposing the rendered first scene element and the rendered second scene element, and displaying the superimposed first scene element and the superimposed second scene element.
Optionally, the method further comprises:
setting display attributes of the first scene element and the second scene element respectively;
superimposing the first scene element and the second scene element based on display attributes of the first scene element and the second scene element;
rendering the first and second superimposed scene elements; and
displaying the rendered first scene element and the rendered second scene element.
Optionally, the method further comprises:
setting display attributes of the first scene element and the second scene element respectively;
superimposing the rendered first scene element and the rendered second scene element based on display attributes of the first scene element and the second scene element; and
displaying the first scene element and the second scene element after being superimposed.
Optionally, when one of the first scene element and the second scene element is edited, the other scene element is hidden.
Optionally, the phase property comprises at least one of physical form, flowability, viscosity, and compressibility.
Optionally, the structural attribute comprises at least one of a shape and a size.
Optionally, the display attribute comprises at least one of color and transparency.
The invention discloses a system for editing virtual scenes, which comprises:
a determining unit, configured to determine a first scene element and a second scene element in a virtual scene to be edited, wherein the first scene element and the second scene element have different phase attributes;
an editing unit, configured to set structural attributes of the first scene element and the second scene element, respectively, and determine a position of the first scene element and the second scene element in the virtual scene to be edited and/or a relative position between the first scene element and the second scene element, where the first scene element and the second scene element at least partially overlap in a space of the virtual scene to be edited.
An electronic device includes a processor and a memory storing computer-executable instructions, the processor being configured to execute the instructions to implement a method of editing a virtual scene.
A computer-readable storage medium having stored thereon computer-executable instructions for execution by a processor to implement a method of editing a virtual scene is disclosed.
A computer program product comprising computer executable instructions for execution by a processor to implement a method of editing a virtual scene is disclosed.
Compared with the prior art, the embodiments of the present invention mainly differ in, and achieve, the following effects:
Scene elements with different phase attributes can be treated as different objects, so that when these scene elements are arranged in the space of the virtual scene to be edited, any voxel in that space may carry a plurality of phase attributes.
Scene elements with different phase attributes can be edited independently, so that editing a scene element with one phase attribute does not affect scene elements with other phase attributes, avoiding erroneous replacement and similar problems.
The edited virtual scene, together with the first scene element and the second scene element in it, can be displayed in the display area of the editor; no unreasonable shape forms where the two scene elements join, and after some scene elements are reset, the new virtual scene and its scene elements are displayed automatically.
Drawings
FIG. 1 is a schematic diagram of editing a virtual scene according to the prior art;
FIG. 2 is a schematic diagram of editing a virtual scene in accordance with the present invention;
FIG. 3 is a flow diagram of a method of editing a virtual scene in accordance with the present invention;
FIG. 4 is a block diagram of a system for editing a virtual scene in accordance with the present invention;
FIG. 5 is another flow diagram of a method of editing a virtual scene in accordance with the present invention;
FIG. 6 is another block diagram of a system for editing a virtual scene in accordance with the present invention;
fig. 7 is a hardware configuration block diagram of an electronic device implementing the method of editing a virtual scene according to the present invention.
Detailed Description
In order to make the purpose and technical solution of the embodiments of the present invention clearer, the technical solution of the embodiments of the present invention will be clearly and completely described below with reference to the drawings of the embodiments of the present invention. It is to be understood that the embodiments described are only a few embodiments of the present invention, and not all embodiments. All other embodiments, which can be derived by a person skilled in the art from the described embodiments of the invention without any inventive step, are within the scope of protection of the invention.
Fig. 1 is a schematic diagram of editing a virtual scene according to the prior art.
A virtual scene to be edited is edited in an editor by editing each voxel in the scene's space. As shown in Fig. 1, the virtual scene to be edited is a game terrain scene. A first number of voxels are selected and set to a solid material, forming a mountain 101 that extends upward from the bottom of the game terrain scene and forms a plurality of peaks 101A and a plurality of valleys 101B. A second number of voxels are selected and set to a liquid material, forming a plurality of bodies of water 102; the bottom 102A of each body of water 102 matches the shape of the corresponding valley 101B, and the top has a horizontal surface 102B. The mountain 101 and the plurality of bodies of water 102 are then rendered and displayed, respectively.
In the prior art, each voxel in the space is edited individually, any voxel can hold only one material, and the mountain 101 and the plurality of bodies of water 102 are not independent objects. When the second number of voxels are being set to liquid, for example a voxel above the bottom of a valley 101B, some voxels in the bottom of the valley 101B may be inadvertently set to liquid as well, destroying the overall shape of the valley 101B.
Second, consider a voxel at the junction of a horizontal surface 102B and the upper part of a valley 101B, with the steep valley wall at one end and the horizontal surface 102B at the other. The computed surface shape of that voxel will be a slanted surface joining the upper part of the valley 101B to the horizontal surface 102B. When the voxel is set to a liquid material, an inclined water surface therefore appears, although in reality the surface of the body of water 102 should always be flat.
In addition, if a hole is punched in the bottom of a valley 101B, the corresponding body of water 102 does not automatically fill the hole; instead, a third number of voxels in the hole must be selected and set to a liquid material, and then re-rendered and displayed.
In view of the above problems, embodiments of the present application provide a method for editing a virtual scene, and according to the method for editing a virtual scene of the embodiments of the present application, scene elements with different phase attributes can be used as different objects, scene elements with different phase attributes can be independently edited, and the scene elements can be rendered and displayed in a superimposed manner.
Fig. 2 is a schematic diagram of editing a virtual scene according to the present invention.
A virtual scene to be edited, together with a first scene element and a second scene element in it, is edited in an editor. The virtual scene to be edited is a game terrain scene, and as shown in Fig. 2, a mountain 201 and a body of water 202 are determined in the scene. The mountain 201 and the water 202 have different phase attributes: the mountain 201 is solid and the water 202 is liquid. The invention treats the solid mountain 201 and the liquid water 202 as two different objects, so that when the mountain 201 and the water 202 are arranged in the space of the game terrain scene, any voxel in that space may hold solid, liquid, or both. For a voxel with two phase attributes, one of them, such as the solid attribute, may be displayed according to a preset priority when rendering for display.
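The preset-priority rule for voxels carrying two phase attributes might be sketched as follows. The priority ordering and function names are assumptions for illustration; the patent does not specify them:

```python
# Assumed priority table: higher value wins at display time. The patent
# only gives "solid over liquid" as an example; the rest is illustrative.
PRIORITY = {"solid": 2, "liquid": 1, "gas": 0}

def displayed_phase(phases):
    """Pick which phase attribute a multi-phase voxel shows on screen."""
    return max(phases, key=PRIORITY.__getitem__) if phases else None

# A voxel that is both solid mountain and liquid groundwater shows solid.
assert displayed_phase({"solid", "liquid"}) == "solid"
assert displayed_phase({"liquid"}) == "liquid"
```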
The structural attributes of the mountain 201 and the water 202 are set, respectively. Optionally, the structural attribute comprises at least one of a shape and a size. As shown in fig. 2, the mountain 201 extends upward from the bottom of the game terrain scene, and forms a plurality of peaks 201A and a plurality of valleys 201B. The bottom of the body of water 202 is an arcuate bottom 202A and the top has a plurality of levels 202B.
The position of the mountain 201 and the water 202 in the game terrain scene and/or the relative position between the mountain 201 and the water 202 are determined, the mountain 201 and the water 202 at least partially overlapping in the space of the game terrain scene. As shown in Fig. 2, most of the water body 202 lies inside the mountain 201 to form groundwater, while the portion having the plurality of horizontal surfaces 202B covers the plurality of valleys 201B to form lakes. The method can edit the solid mountain 201 and the liquid water body 202 independently, so that editing the solid mountain 201 does not affect the liquid water body 202, avoiding erroneous replacement and similar problems.
The mountain 201 and the water 202 are rendered and displayed in superposition. As shown in Fig. 2, the surface of the mountain 201 and the horizontal surfaces 202B covering the valleys 201B are displayed based on the game player's viewing angle, while the interiors of the mountain 201 and the water body 202 are not displayed. The invention can display the edited game terrain scene, with the mountain 201 and the water 202 in it, in the display area of the editor; when the mountain 201 and the water 202 are displayed, no unreasonable shape forms where they join, and the lakes keep horizontal surfaces.
After the structural attributes of the mountain 201 are reset (for example, one valley 201B is extended downward while the water 202 still covers it), the reset mountain 201 and the water 202 are rendered and displayed in superposition so that the downward-extended valley 201B is covered by the water 202. This presents the effect of punching a hole in the bottom of a lake and having the water 202 fill it automatically. Thus, after some scene elements are reset, the invention automatically displays the new virtual scene and the scene elements in it.
Fig. 3 is a flow chart of a method of editing a virtual scene according to the present invention. As shown in fig. 3, the first embodiment includes:
in step S301, a first scene element and a second scene element in a virtual scene to be edited are determined, where the first scene element and the second scene element have different phase attributes.
Alternatively, the virtual scene to be edited may be a 3D virtual scene, and the first and second scene elements may be 3D virtual objects arranged in a 3D virtual space to form a complete virtual scene. It is understood that the 3D virtual space may contain any number of scene elements; the first scene element and the second scene element are taken as an example here, and other scene elements are processed similarly. A display area and an editing area are set in the editor: the virtual scene to be edited, with its first and second scene elements, is shown in the display area, and a first scene element tab and a second scene element tab are set in the editing area for editing the first and second scene elements. For example, as shown in Fig. 2, the editor may be a game terrain editor, the virtual scene to be edited is a game terrain scene, and the first and second scene elements are a mountain 201 and a body of water 202, respectively.
Optionally, for the determined first scene element and second scene element, a first scene element model and a second scene element model are respectively established. Wherein the first scene element model and the second scene element model store phase attributes of the first scene element and the second scene element, respectively, and are presented as phase attribute options in the first scene element tab and the second scene element tab, respectively. Scene elements with different phase attributes are independent of each other in the model layer and the data layer and belong to different objects.
Optionally, the phase property comprises at least one of physical form, flowability, viscosity, and compressibility. Wherein the physical form includes a liquid state, a solid state, and a gaseous state, and the first scene element and the second scene element may have different physical forms. For example, as described above, the first scene element and the second scene element are a mountain (solid state) and a body of water (liquid state), respectively. Similarly, the first scene element and the second scene element may also have different flowability, viscosity, and/or compressibility. For example, the first scene element and the second scene element are a mountain and a quicksand, respectively.
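A phase-attribute record covering the four listed properties could look like the following sketch. The field types, value ranges, and the concrete numbers for rock and quicksand are illustrative assumptions:

```python
from dataclasses import dataclass

# Assumed record shape for the phase attributes named in the text;
# the units and scales are illustrative, not from the patent.
@dataclass(frozen=True)
class PhaseAttributes:
    physical_form: str        # "solid", "liquid", or "gas"
    flowability: float        # 0 = rigid, 1 = free-flowing
    viscosity: float          # arbitrary units
    compressibility: float    # 0 = incompressible

rock = PhaseAttributes("solid", 0.0, 0.0, 0.0)
quicksand = PhaseAttributes("solid", 0.6, 5.0, 0.1)

# Same physical form, yet distinct phase attributes -- matching the
# mountain-versus-quicksand example above.
assert rock.physical_form == quicksand.physical_form
assert rock.flowability != quicksand.flowability
```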
Through step S301, the present invention may use scene elements with different phase attributes as different objects, so that when these scene elements are set in the space of the virtual scene to be edited, there may be multiple phase attributes for any one voxel in the space.
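The "multiple phase attributes per voxel" claim follows naturally if each scene element keeps its own voxel set, so one spatial cell can be occupied in several layers at once. A minimal sketch, with assumed class and field names:

```python
# Assumed data layout: one voxel set per scene element, so layers can
# overlap freely instead of competing for a single material slot.
class SceneElement:
    def __init__(self, name, phase):
        self.name, self.phase = name, phase
        self.voxels = set()            # cells occupied by this element

mountain = SceneElement("mountain", "solid")
water = SceneElement("water", "liquid")
mountain.voxels.update({(0, 0, 0), (1, 0, 0)})
water.voxels.update({(1, 0, 0), (2, 0, 0)})    # overlaps the mountain

def phases_at(cell, elements):
    """All phase attributes present in one voxel."""
    return {e.phase for e in elements if cell in e.voxels}

# The shared cell carries both phase attributes at once.
assert phases_at((1, 0, 0), [mountain, water]) == {"solid", "liquid"}
```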
In step S302, structural attributes of the first scene element and the second scene element are respectively set, and a position of the first scene element and the second scene element in the virtual scene to be edited and/or a relative position between the first scene element and the second scene element are determined, where the first scene element and the second scene element may at least partially overlap in a space of the virtual scene to be edited. For example, as shown in fig. 2, a majority of the area of the body of water 202 is located inside the mountain 201 and another minority of the area of the body of water 202 floats outside the mountain 201 covering the valley 201B.
Optionally, the first scene element model and the second scene element model respectively store the structural properties of the first scene element and the second scene element and the positions of the first scene element and the second scene element in the virtual scene to be edited and/or the relative positions between the first scene element and the second scene element, and are respectively presented in the first scene element tab and the second scene element tab as structural property options and position options.
The shapes and the sizes of the first scene element and the second scene element can be respectively determined by operations such as zooming, stretching, increasing and decreasing, smoothing and brushing in the display area and/or by operations such as setting shape parameters and size parameters in the structure attribute options in the scene element tab.
The positions of the first scene element and the second scene element in the virtual scene to be edited and/or the relative positions of the first scene element and the second scene element can be respectively determined by operations such as moving in the display area and/or by operations such as setting position parameters of position options in the scene element tab.
The first scene element and the second scene element are at least partially overlapped in the space of the virtual scene to be edited, so that the first scene element and the second scene element can interact.
Optionally, while one of the first scene element and the second scene element is being edited, the other is hidden. When multiple scene elements with different phase attributes are edited, switching between the first scene element tab, the second scene element tab, and so on allows the scene element with one specific phase attribute to be edited while the scene elements with other phase attributes, or all remaining scene elements, are hidden. The scene element being edited is thus shown more clearly in the display area, and the prior-art problem of accidentally editing scene elements other than the intended one is avoided. Switching between the first and second scene element tabs is similar to selecting and displaying different layers in graphics editing software: the first and second scene elements belong to different layers, and the corresponding scene element can be rendered and displayed by selecting its layer.
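The tab-switching behaviour reduces to a simple visibility toggle over the element layers. Names and structure below are illustrative assumptions:

```python
# Assumed sketch of the editor's tab switch: activating one element's
# tab hides every other element, so stray brush strokes cannot touch
# the hidden layers.
def activate_tab(elements, active_name):
    for e in elements:
        e["visible"] = (e["name"] == active_name)
    return elements

layers = [{"name": "mountain", "visible": True},
          {"name": "water", "visible": True}]
activate_tab(layers, "water")    # edit the water; hide the mountain

assert [e["visible"] for e in layers] == [False, True]
```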
Through the step S302, the present invention can independently edit scene elements with different phase attributes, so that when editing a scene element with one phase attribute, the scene element with another phase attribute will not be affected, and situations such as wrong replacement will not occur.
Fig. 4 is a block diagram of a system for editing a virtual scene according to the present invention. As shown in fig. 4, the second embodiment includes:
a determining unit 401, configured to determine a first scene element and a second scene element in a virtual scene to be edited, where the first scene element and the second scene element have different phase attributes.
Through the determining unit 401, the present invention may treat scene elements with different phase attributes as different objects, so that when these scene elements are set in the space of the virtual scene to be edited, any voxel in that space may carry a plurality of phase attributes.
An editing unit 402, configured to set structural attributes of the first scene element and the second scene element, respectively, and determine a position of the first scene element and the second scene element in the virtual scene to be edited and/or a relative position between the first scene element and the second scene element, where the first scene element and the second scene element at least partially overlap in a space of the virtual scene to be edited.
Through the editing unit 402, the invention can independently edit scene elements with different phase attributes, so that when editing a scene element with one phase attribute, the scene element with another phase attribute is not affected, and the situations of wrong replacement and the like are avoided.
The first embodiment is a method embodiment corresponding to the present embodiment, and the present embodiment can be implemented in cooperation with the first embodiment. The related technical details mentioned in the first embodiment are still valid in this embodiment, and are not described herein again in order to reduce repetition. Accordingly, the related-art details mentioned in the present embodiment can also be applied to the first embodiment.
Fig. 5 is another flowchart of a method of editing a virtual scene according to the present invention. As shown in fig. 5, the third embodiment includes, in addition to the steps described in the first embodiment:
in step S303, the first scene element and the second scene element are rendered and displayed in superposition. For example, after editing of a game terrain scene is completed by an editor based on the viewing angle of a game player, the surface of the mountain 201 and the surface of the water 202 covering the valley 201B are displayed in the displayed game terrain by rendering, and the mountain 201 and the inside of the water 202 are not displayed.
Optionally, one of the rendering and displaying schemes is to superimpose the first scene element and the second scene element, render the superimposed first scene element and second scene element, and display the rendered first scene element and second scene element. And further optionally setting display attributes of the first scene element and the second scene element, respectively; superimposing the first scene element and the second scene element based on the display attributes of the first scene element and the second scene element; rendering the first scene element and the second scene element after superposition; and displaying the rendered first scene element and second scene element.
In this scheme, the display attribute of the first scene element and the display attribute of the second scene element are combined by logical operations (such as AND, OR, NOT), the superposition is treated as a whole whose display attributes are determined by that combination, and rendering and display are then performed based on the display attributes of the superposed whole.
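The superimpose-then-render scheme can be sketched as a merge of the two elements' display attributes into one attribute set before a single render pass. The concrete combination rule below (an OR-like visibility rule plus keeping the less transparent element's color) is an assumption; the patent only says logical operations are used:

```python
# Assumed merge rule for the superposed whole: visible if either part
# is visible (OR-like), color taken from the less transparent part.
def combine_display(attrs_a, attrs_b):
    """Treat the superposition as one object with merged attributes."""
    return {
        "color": attrs_a["color"] if attrs_a["alpha"] >= attrs_b["alpha"]
                 else attrs_b["color"],
        "alpha": max(attrs_a["alpha"], attrs_b["alpha"]),
    }

merged = combine_display({"color": "brown", "alpha": 1.0},
                         {"color": "blue", "alpha": 0.5})
assert merged == {"color": "brown", "alpha": 1.0}
```

The renderer would then draw the merged object once, rather than drawing each element separately.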
Optionally, another rendering and displaying scheme is to render the first scene element and the second scene element, superimpose the rendered first scene element and the rendered second scene element, and display the superimposed first scene element and the superimposed second scene element. And further optionally setting display attributes of the first scene element and the second scene element, respectively; superimposing the rendered first scene element and second scene element based on the display attributes of the first scene element and the second scene element; and displaying the first scene element and the second scene element after being overlapped.
The first scene element and the second scene element are used as individuals to be independently rendered, the two rendered scene elements are overlapped, and the two overlapped scene elements are displayed.
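One standard way to superimpose two independently rendered elements is "over" alpha compositing; the patent does not name a specific operator, so this is an assumed sketch with per-pixel intensity-plus-alpha tuples standing in for rendered images:

```python
# Assumed compositing step: Porter-Duff "over", applied per pixel to
# the two independently rendered scene elements.
def over(front, back):
    """Composite a front pixel (value, alpha) onto a back pixel."""
    fv, fa = front
    bv, ba = back
    out_a = fa + ba * (1 - fa)
    out_v = (fv * fa + bv * ba * (1 - fa)) / out_a if out_a else 0.0
    return (out_v, out_a)

water_px = (0.2, 0.5)    # half-transparent water in front
rock_px = (0.8, 1.0)     # opaque rock behind it
blended = over(water_px, rock_px)

assert blended[1] == 1.0                 # result is opaque
assert abs(blended[0] - 0.5) < 1e-9      # values mix 50/50
```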
Optionally, the first scene element model and the second scene element model store display properties of the first scene element and the second scene element, respectively, and are presented as display property options in the first scene element tab and the second scene element tab, respectively.
Optionally, the display attribute comprises at least one of color and transparency. The color and transparency of the first and second scene elements may be set separately by operations such as setting the color parameter and transparency parameter in the display attribute options in the element tab.
Optionally, after the structural attributes of the first scene element and/or the second scene element, the position of the first scene element and the second scene element in the virtual scene to be edited and/or the relative position between the first scene element and the second scene element, and/or the display attributes of the first scene element and/or the second scene element are reset, the reset first scene element and second scene element are rendered and displayed in an overlapping manner. For example, the shape of the mountain 201 is rearranged to extend the valley 201B downward, and the water 202 covers the valley 201B downward, and by rendering and displaying the rearranged mountain 201 and the water 202 in superposition, so that the valley 201B downward is covered by the water 202, the effect that the hole is punched in the water bottom, the hole can be automatically filled with the water is exhibited.
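The automatic-fill effect follows from the layers being independent: deepening the valley only removes solid voxels, while the liquid layer's volume is untouched, so the next render shows water in the new hole. A sketch under assumed 2D set-based layers:

```python
# Assumed layer model: the liquid layer overlaps the solid layer
# (groundwater), and liquid is displayed wherever no solid is present.
mountain = {(x, y) for x in range(5) for y in range(2)}    # solid block
water = {(x, y) for x in range(5) for y in range(4)}       # overlapping liquid

def visible_liquid(mountain, water):
    """Liquid shows in every cell it occupies that holds no solid."""
    return water - mountain

before = visible_liquid(mountain, water)
mountain.discard((2, 1))    # reset the terrain: punch a hole in the valley
after = visible_liquid(mountain, water)

# The hole is water-filled at the next render without editing the liquid.
assert (2, 1) not in before and (2, 1) in after
```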
Through step S303, the present invention can display the edited virtual scene, together with the first scene element and the second scene element therein, in the display area of the editor. When the two scene elements are displayed, no unreasonable shape is formed where they join, and after some scene elements are reset, the new virtual scene and the scene elements therein are displayed automatically.
Fig. 6 is another block diagram of a system for editing a virtual scene according to the present invention. As shown in fig. 6, the fourth embodiment includes, in addition to the units described in the second embodiment:
a display unit 403 for rendering and displaying the first scene element and the second scene element in superposition.
Through the display unit 403, the present invention can display the edited virtual scene, together with the first scene element and the second scene element therein, in the display area of the editor. When the two scene elements are displayed, no unreasonable shape is formed where they join, and after some scene elements are reset, the new virtual scene and the scene elements therein are displayed automatically.
The third embodiment is a method embodiment corresponding to the present embodiment, and the present embodiment can be implemented in cooperation with the third embodiment. The related technical details mentioned in the third embodiment are still valid in this embodiment, and are not described herein again in order to reduce repetition. Accordingly, the related-art details mentioned in the present embodiment can also be applied to the third embodiment.
Fig. 7 is a hardware configuration block diagram of an electronic device implementing the method of editing a virtual scene according to the present invention.
As shown in fig. 7, the electronic device 700 may include one or more processors 702, a system motherboard 708 coupled to at least one of the processors 702, system memory 704 coupled to the system motherboard 708, non-volatile memory (NVM) 706 coupled to the system motherboard 708, and a network interface 710 coupled to the system motherboard 708.
The processor 702 may include one or more single-core or multi-core processors. The processor 702 may include any combination of general-purpose processors and dedicated processors (e.g., graphics processors, application processors, baseband processors, etc.). In embodiments of the invention, the processor 702 may be configured to perform one or more of the various embodiments shown in fig. 3 and 5.
In some embodiments, the system motherboard 708 may include any suitable interface controllers to provide any suitable interface to at least one of the processors 702 and/or any suitable device or component in communication with the system motherboard 708.
In some embodiments, the system motherboard 708 may include one or more memory controllers to provide an interface to the system memory 704. System memory 704 may be used to load and store data and/or instructions. In some embodiments, system memory 704 of electronic device 700 may include any suitable volatile memory, such as suitable Dynamic Random Access Memory (DRAM).
NVM 706 may include one or more tangible, non-transitory computer-readable media for storing data and/or instructions. In some embodiments, NVM 706 may include any suitable non-volatile memory, such as flash memory, and/or any suitable non-volatile storage device, such as at least one of an HDD (Hard Disk Drive), CD (Compact Disc) Drive, DVD (Digital Versatile Disc) Drive.
The NVM 706 may include a portion of a storage resource installed on the electronic device 700, or it may be accessible by, but not necessarily a part of, the device. For example, the NVM 706 can be accessed over a network via the network interface 710.
In particular, the system memory 704 and the NVM 706 may each include a temporary copy and a permanent copy of instructions 720. The instructions 720 may include instructions that, when executed by at least one of the processors 702, cause the electronic device 700 to implement the methods shown in fig. 3 and 5. In some embodiments, the instructions 720, or hardware, firmware, and/or software components thereof, may additionally/alternatively be located in the system motherboard 708, the network interface 710, and/or the processor 702.
The network interface 710 may include a transceiver to provide a radio interface for the electronic device 700 to communicate with any other suitable device (e.g., a front-end module, an antenna, etc.) over one or more networks. In some embodiments, the network interface 710 may be integrated with other components of the electronic device 700. For example, the network interface 710 may be integrated with at least one of the processors 702, the system memory 704, the NVM 706, and a firmware device (not shown) having instructions that, when executed by at least one of the processors 702, cause the electronic device 700 to implement one or more of the various embodiments shown in fig. 3 and 5.
The network interface 710 may further include any suitable hardware and/or firmware to provide a multiple-input multiple-output radio interface. For example, network interface 710 may be a network adapter, a wireless network adapter, a telephone modem, and/or a wireless modem.
In one embodiment, at least one of the processors 702 may be packaged together with one or more controllers for a system motherboard 708 to form a System In Package (SiP). In one embodiment, at least one of the processors 702 may be integrated on the same die with one or more controllers for a system motherboard 708 to form a system on a chip (SoC).
The electronic device 700 may further include an input/output (I/O) device 712 coupled to the system motherboard 708. The I/O device 712 may include a user interface to enable a user to interact with the electronic device 700, and may be designed with a peripheral component interface so that peripheral components can also interact with the electronic device 700. In some embodiments, the electronic device 700 further includes a sensor for determining at least one of environmental conditions and location information associated with the electronic device 700.
In some embodiments, the I/O device 712 may include, but is not limited to, a display (e.g., a liquid crystal display, a touch screen display, etc.), a speaker, a microphone, one or more cameras (e.g., still image cameras and/or video cameras), a flash (e.g., a light-emitting diode flash), and a keyboard.
In some embodiments, the peripheral component interfaces may include, but are not limited to, a non-volatile memory port, an audio jack, and a power interface.
In some embodiments, the sensors may include, but are not limited to, a gyroscope sensor, an accelerometer, a proximity sensor, an ambient light sensor, and a positioning unit. The positioning unit may also be part of the network interface 710 or interact with the network interface 710 to communicate with components of a positioning network, such as Global Positioning System (GPS) satellites.
It is to be understood that the illustrated structure of the embodiment of the invention is not to be construed as a specific limitation to the electronic device 700. In other embodiments of the present application, the electronic device 700 may include more or fewer components than shown, or combine certain components, or split certain components, or a different arrangement of components. The illustrated components may be implemented in hardware, software, or a combination of software and hardware.
Program code may be applied to input instructions to perform the functions described in this disclosure and to generate output information. The output information may be applied to one or more output devices in a known manner. For purposes of this application, a system for processing instructions, such as one including the processor 702, includes any system having a processor such as a digital signal processor (DSP), a microcontroller, an application-specific integrated circuit (ASIC), or a microprocessor.
The program code may be implemented in a high level procedural or object oriented programming language to communicate with a processing system. The program code can also be implemented in assembly or machine language, if desired. Indeed, the mechanisms described in this disclosure are not limited in scope to any particular programming language. In any case, the language may be a compiled or interpreted language.
One or more aspects of at least one embodiment may be implemented by instructions stored on a computer-readable storage medium, which when read and executed by a processor, enable an electronic device to implement the methods of the embodiments described herein.
The present invention also provides a computer-readable storage medium having stored thereon computer-executable instructions for execution by a processor to implement the method of editing a virtual scene described above.
The present invention also provides a computer program product comprising computer executable instructions for execution by a processor to implement the method of editing a virtual scene described above.
While the invention has been shown and described with reference to certain preferred embodiments thereof, it will be understood by those skilled in the art that various changes in form and details may be made therein without departing from the spirit and scope of the invention.

Claims (13)

1. A method of editing a virtual scene, the method being for an electronic device and the method comprising:
determining a first scene element and a second scene element in a virtual scene to be edited, wherein the first scene element and the second scene element have different phase attributes; and
setting structural attributes of the first scene element and the second scene element respectively, and determining the positions of the first scene element and the second scene element in the virtual scene to be edited and/or the relative positions between the first scene element and the second scene element, wherein the first scene element and the second scene element at least partially overlap in the space of the virtual scene to be edited.
2. The method of claim 1, further comprising:
rendering and displaying the first scene element and the second scene element in superposition.
3. The method of claim 2, wherein the overlappingly rendering and displaying the first scene element and the second scene element comprises:
superimposing the first scene element and the second scene element, rendering the superimposed first scene element and second scene element, and displaying the rendered first scene element and second scene element.
4. The method of claim 2, wherein the overlappingly rendering and displaying the first scene element and the second scene element comprises:
rendering the first scene element and the second scene element, superimposing the rendered first scene element and the rendered second scene element, and displaying the superimposed first scene element and the superimposed second scene element.
5. The method of claim 3, further comprising:
setting display attributes of the first scene element and the second scene element respectively;
superimposing the first scene element and the second scene element based on display attributes of the first scene element and the second scene element;
rendering the first and second superimposed scene elements; and
displaying the rendered first scene element and the rendered second scene element.
6. The method of claim 4, further comprising:
setting display attributes of the first scene element and the second scene element respectively;
superimposing the rendered first scene element and the rendered second scene element based on display attributes of the first scene element and the second scene element; and
displaying the first scene element and the second scene element after being superimposed.
7. The method of any of claims 1-6, wherein, when one of the first scene element and the second scene element is edited, the other scene element is hidden.
8. The method of any of claims 1-6, wherein the phase property comprises at least one of physical form, flowability, viscosity, and compressibility.
9. The method of any of claims 1-6, wherein the structural attribute comprises at least one of a shape and a size.
10. The method of claim 5 or 6, wherein the display attribute comprises at least one of color and transparency.
11. An electronic device, comprising a processor and a memory storing computer-executable instructions, the processor being configured to execute the instructions to implement the method of editing a virtual scene of any of claims 1-10.
12. A computer-readable storage medium having computer-executable instructions stored thereon, the instructions being executable by a processor to implement the method of editing a virtual scene of any of claims 1-10.
13. A computer program product comprising computer executable instructions, wherein the instructions are executed by a processor to implement the method of editing a virtual scene of any one of claims 1 to 10.
CN202210036337.2A 2022-01-13 2022-01-13 Method, apparatus, medium, and computer program product for editing virtual scene Pending CN114367113A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210036337.2A CN114367113A (en) 2022-01-13 2022-01-13 Method, apparatus, medium, and computer program product for editing virtual scene

Publications (1)

Publication Number Publication Date
CN114367113A true CN114367113A (en) 2022-04-19

Family

ID=81143567

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210036337.2A Pending CN114367113A (en) 2022-01-13 2022-01-13 Method, apparatus, medium, and computer program product for editing virtual scene

Country Status (1)

Country Link
CN (1) CN114367113A (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115461707A (en) * 2022-07-08 2022-12-09 上海莉莉丝科技股份有限公司 Video acquisition method, electronic device, storage medium, and program product
CN115461707B (en) * 2022-07-08 2023-10-13 上海莉莉丝科技股份有限公司 Video acquisition method, electronic device and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination