CN117282106A - Game editing method, game editing device, storage medium and electronic apparatus - Google Patents

Info

Publication number
CN117282106A
Authority
CN
China
Prior art keywords
game
game scene
scene
editing
instruction
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202311315514.1A
Other languages
Chinese (zh)
Inventor
彭凡晨
许帆
高峰
谭宇晗
罗鸣
陈琪欣
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Netease Hangzhou Network Co Ltd
Original Assignee
Netease Hangzhou Network Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Netease Hangzhou Network Co Ltd
Priority to CN202311315514.1A
Publication of CN117282106A
Legal status: Pending

Classifications

    • A - HUMAN NECESSITIES
    • A63 - SPORTS; GAMES; AMUSEMENTS
    • A63F - CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F 13/00 - Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F 13/60 - Generating or modifying game content before or while executing the game program, e.g. authoring tools specially adapted for game development or game-integrated level editor
    • A63F 13/63 - Generating or modifying game content before or while executing the game program, e.g. authoring tools specially adapted for game development or game-integrated level editor, by the player, e.g. authoring using a level editor
    • A63F 13/50 - Controlling the output signals based on the game progress
    • A63F 13/52 - Controlling the output signals based on the game progress involving aspects of the displayed game scene
    • A63F 2300/00 - Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F 2300/30 - Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game, characterized by output arrangements for receiving control signals generated by the game device
    • A63F 2300/308 - Details of the user interface

Abstract

The disclosure provides a game editing method, a game editing device, a storage medium and an electronic device, and relates to the technical field of games and human-computer interaction. The method includes: displaying a game scene editing interface in a graphical user interface provided by running a game program; associating at least two game scenes in response to a game scene association instruction in the game scene editing interface, the at least two game scenes being independent game scenes; editing a game scene to be edited according to an editing instruction in response to the editing instruction for the game scene to be edited, the game scene to be edited being one of the at least two game scenes; and generating a game scene combination including the at least two game scenes, the game scene combination being configured to provide successive game stages during a game running phase. This editing mode, which can generate a game scene combination, helps to enrich the types and playing methods of editable game scenes and improves editing efficiency.

Description

Game editing method, game editing device, storage medium and electronic apparatus
Technical Field
The present disclosure relates to the field of games and human-computer interaction technologies, and in particular, to a game editing method, a game editing device, a computer-readable storage medium, and an electronic apparatus.
Background
With the development of computer and human-computer interaction technology, games have become an important form of daily leisure and entertainment. Some games allow the game scene to be edited: for example, a user may edit the layout, background, or even the gameplay of a game scene to obtain a customized game scene. However, current game scene editing modes are relatively fixed, so the types of edited game scenes are relatively limited and editing efficiency is low.
Disclosure of Invention
The present disclosure provides a game editing method, a game editing device, a computer-readable storage medium, and an electronic apparatus, to solve, at least to some extent, the problems of limited game scene types and low editing efficiency.
According to a first aspect of the present disclosure, there is provided a game editing method, the method comprising: displaying a game scene editing interface in a graphical user interface provided by running a game program; responding to game scene association instructions in the game scene editing interface, and associating at least two game scenes, wherein the at least two game scenes are independent game scenes; responding to an editing instruction of a game scene to be edited, and editing the game scene to be edited according to the editing instruction; the game scene to be edited is a game scene in the at least two game scenes; generating a game scene combination comprising the at least two game scenes; the game scenario combination is configured to provide successive game stages at a game run stage, the successive game stages including a game stage corresponding to each of the game scenarios.
According to a second aspect of the present disclosure, there is provided a game editing device, the device comprising: the editing interface display module is configured to display a game scene editing interface in a graphical user interface provided by running a game program; a game scene association module configured to associate at least two game scenes in response to a game scene association instruction in the game scene editing interface, wherein the at least two game scenes are independent game scenes; the game scene editing module is configured to respond to an editing instruction of a game scene to be edited, and edit the game scene to be edited according to the editing instruction; the game scene to be edited is a game scene in the at least two game scenes; a game scene combination generation module configured to generate a game scene combination including the at least two game scenes; the game scenario combination is configured to provide successive game stages at a game run stage, the successive game stages including a game stage corresponding to each of the game scenarios.
According to a third aspect of the present disclosure, there is provided a computer readable storage medium having stored thereon a computer program which, when executed by a processor, implements the game editing method of the first aspect described above and possible implementations thereof.
According to a fourth aspect of the present disclosure, there is provided an electronic device comprising: a processor; and a memory for storing executable instructions of the processor; wherein the processor is configured to perform the game editing method of the first aspect and possible implementations thereof via execution of the executable instructions.
The technical scheme of the present disclosure has the following beneficial effects:
on the one hand, an editing mode capable of generating a game scene combination is provided, and the game scene combination can provide successive game stages during the game running phase. This breaks through the limitation in the related art that only a single game scene can be edited, and helps to enrich the types and playing methods of edited game scenes. On the other hand, the game scenes in the game scene combination can be edited conveniently and quickly, which simplifies user operations and improves editing efficiency.
Drawings
FIG. 1 shows a system architecture diagram in the present exemplary embodiment;
FIG. 2 shows a flowchart of a game editing method in the present exemplary embodiment;
FIG. 3A shows a schematic diagram of a game scene editing interface in the present exemplary embodiment;
FIG. 3B shows a schematic diagram of another game scene editing interface in the present exemplary embodiment;
FIG. 4 shows a schematic diagram of yet another game scene editing interface in the present exemplary embodiment;
FIG. 5 shows a flowchart of a method of determining and updating a game scene sequence in the present exemplary embodiment;
FIG. 6 shows a schematic diagram of a preview interface of an associated game scene in the present exemplary embodiment;
FIG. 7 shows a schematic diagram of setting game progress information in the present exemplary embodiment;
FIG. 8A shows a schematic diagram of information specifying a game scene in the present exemplary embodiment;
FIG. 8B shows a schematic diagram of a preparation screen in the present exemplary embodiment;
FIG. 9 shows a schematic structural diagram of a game editing device in the present exemplary embodiment;
FIG. 10 shows a schematic structural diagram of an electronic device in the present exemplary embodiment.
Detailed Description
Exemplary embodiments of the present disclosure will be described more fully hereinafter with reference to the accompanying drawings.
The drawings are schematic illustrations of the present disclosure and are not necessarily drawn to scale. Some of the block diagrams shown in the figures may be functional entities and do not necessarily correspond to physically or logically separate entities. These functional entities may be implemented in software, or in hardware modules or integrated circuits, or in networks, processors or microcontrollers. Embodiments may be embodied in many forms and should not be construed as limited to the examples set forth herein. The described features, structures, or characteristics of the disclosure may be combined in any suitable manner in one or more embodiments. In the following description, numerous specific details are provided to give a thorough description of embodiments of the present disclosure. However, it will be recognized by one skilled in the art that one or more of the specific details may be omitted, or other methods, components, devices, steps, etc. may be used instead of one or more of the specific details in implementing the aspects of the present disclosure.
The present inventors have found that, in the related art, game scene editing supports only the editing of a single game scene and does not support editing multiple game scenes together, which results in a limited range of edited game scene types and low editing efficiency.
In view of the above, exemplary embodiments of the present disclosure provide a game editing method.
Fig. 1 shows a system architecture diagram of an operating environment of the present exemplary embodiment. The system architecture may include a client 110, a server 120. Wherein the client 110 is a terminal device that installs and runs a game client program. The terminal device may be a mobile phone, a tablet computer, a personal computer, an intelligent wearable device, a game machine, or the like, which has a display function and is capable of displaying a graphical user interface, and the graphical user interface may include an interface of an operating system or an interface of an application program, or the like. The server 120 generally refers to a background system that provides game services in the present exemplary embodiment, and may be one server or a cluster of multiple servers. The server 120 is deployed with a game server program for executing game data processing of the server. The client 110 and the server 120 may form a connection through a wired or wireless communication link for data transmission. The game editing method in the present exemplary embodiment may be executed by any one or more of the client 110 and the server 120.
In one embodiment, the game editing method may be implemented and executed based on a cloud interaction system. The cloud interaction system may adopt the system architecture described above. Various cloud applications can run on the cloud interaction system, for example cloud games. Taking a cloud game as an example, a cloud game is a game mode based on cloud computing. In the cloud game operation mode, the entity that runs the game program is separated from the entity that presents the game pictures: storage and execution of the game's control and interaction logic are completed on a cloud game server (such as the server 120), while the cloud game client (such as the client 110) is responsible for receiving and sending data and presenting game pictures. For example, the cloud game client may be a display device with a data transmission function close to the user side, such as a mobile terminal, a television, a computer or a handheld computer, while the information processing is performed by the cloud game server in the cloud. When playing a game or editing a game scene, the user operates the cloud game client to send an operation instruction to the cloud game server; the cloud game server runs the game according to the operation instruction, encodes and compresses data such as game pictures, and returns the data to the cloud game client through the network; finally the cloud game client decodes the data and outputs the game pictures.
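As a rough illustration of this division of labor only, the sketch below (in Python) models the cloud game client as a thin presenter and the cloud game server as the party that runs the game logic and returns encoded picture data. All class and method names are illustrative assumptions, not part of the disclosure.

    class CloudGameServer:
        """Runs the game logic and produces encoded picture data (the server 120)."""
        def handle(self, operation: str) -> bytes:
            frame = f"picture-after-{operation}"   # run the game according to the operation instruction
            return frame.encode("utf-8")           # encode/compress the picture data

    class CloudGameClient:
        """Only forwards input and presents pictures (the client 110)."""
        def __init__(self, server: CloudGameServer):
            self.server = server
        def send_operation(self, operation: str) -> str:
            encoded = self.server.handle(operation)  # a network round trip in a real system
            return encoded.decode("utf-8")           # decode and present the game picture

    client = CloudGameClient(CloudGameServer())
    print(client.send_operation("move_left"))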
In one embodiment, the game editing method may be implemented in a stand-alone game. In this case no server needs to be deployed; the complete game program is installed on the client 110, which executes the game editing method.
In one embodiment, referring to fig. 2, the game editing method may include the following steps S210 to S240:
step S210, displaying a game scene editing interface in a graphical user interface provided by running a game program;
step S220, responding to game scene association instructions in a game scene editing interface, and associating at least two game scenes, wherein the at least two game scenes are independent game scenes;
step S230, responding to an editing instruction of the game scene to be edited, and editing the game scene to be edited according to the editing instruction; the game scene to be edited is one of the at least two game scenes;
step S240, generating a game scene combination comprising at least two game scenes; the game scenario combination is configured to provide successive game stages during a game run phase, the successive game stages including a corresponding game stage for each game scenario.
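To make the flow of steps S210 to S240 concrete, the following minimal sketch associates two independent scenes, edits one of them, and builds an ordered combination from which successive stages could be generated. The data structures and names are assumptions for illustration, not the patent's prescribed implementation.

    from dataclasses import dataclass, field
    from typing import List

    @dataclass
    class GameScene:
        scene_id: str
        components: List[str] = field(default_factory=list)

    @dataclass
    class SceneCombination:
        scenes: List[GameScene]        # ordered: one game stage per scene at run time

    class SceneEditor:
        def __init__(self):
            self.associated: List[GameScene] = []

        def show_editing_interface(self) -> None:                 # S210
            print("game scene editing interface displayed")

        def associate(self, *scenes: GameScene) -> None:          # S220: associate independent scenes
            self.associated.extend(scenes)

        def edit(self, scene_id: str, component: str) -> None:    # S230: edit one associated scene
            scene = next(s for s in self.associated if s.scene_id == scene_id)
            scene.components.append(component)

        def build_combination(self) -> SceneCombination:          # S240: combination of the scenes
            return SceneCombination(scenes=list(self.associated))

    editor = SceneEditor()
    editor.show_editing_interface()
    editor.associate(GameScene("D1"), GameScene("D2"))
    editor.edit("D1", "racing_track")
    combination = editor.build_combination()
    print([scene.scene_id for scene in combination.scenes])       # ['D1', 'D2']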
In the method shown in fig. 2, on the one hand, an editing mode capable of generating a game scene combination is provided, and the game scene combination can provide successive game stages during the game running phase. This breaks through the limitation in the related art that only a single game scene can be edited, and helps to enrich the types and playing methods of edited game scenes. On the other hand, the game scenes in the game scene combination can be edited conveniently and quickly, which simplifies user operations and improves editing efficiency.
Each step in fig. 2 is described in detail below.
Referring to fig. 2, in step S210, a game scene editing interface is displayed in a graphical user interface provided by running a game program.
When the terminal device runs the game program, a game scene editing interface can be displayed in the graphical user interface. The game program may be a game main program, in which a game scene editing function (e.g., a game editor is built in the game program) is provided, and when the user uses the function, a game scene editing interface may be displayed in the game main program. Alternatively, the game program may be an editing program that is compatible with the game main program, and the editing program may be executed independently without executing the game main program, and when the game scene editing is performed using the editing program, a game scene editing interface may be displayed.
The present exemplary embodiment supports custom editing of game scenes by players. Therefore, the users herein may be game developers at a game company (e.g., artists) as well as players.
The game scene editing interface displayed in step S210 may be an editing interface of a specific certain game scene. For example, the user may choose to create a new game scene and edit it, or may choose to edit an existing game scene. The newly created game scene or the selected game scene is the game scene in the current editing, and is recorded as a first game scene for convenience of explanation. In step S210, an editing interface of the first game scene may be displayed. Or further, the editing interface may be an editing setting interface of the first game scene. The game scene editing interface displayed in step S210 may be a general editing interface that is not specific to a certain game scene. For example, in a case where the user does not select to edit a certain game scene, an editing interface of a blank game scene, a game scene editing setting interface, or the like may be displayed.
With continued reference to fig. 2, in step S220, at least two game scenes are associated in response to the game scene association instruction in the game scene editing interface, wherein the at least two game scenes are independent of each other.
The game scene association instruction is used for associating a plurality of game scenes into a whole. For example, a game scenario association control may be provided in the game scenario editing interface, through which a user may select a plurality of game scenarios (which may include newly created game scenarios) and associate the game scenarios.
The associated game scenes themselves may be independent of each other, i.e. each game scene can independently form a game level that does not depend on the other game scenes. For example, before being associated, each game scene may be published individually to form a game map that a player can obtain from the game server for a game experience. Each game scene may have independent play rules, scene styles, peripheral information, etc. The peripheral information may include, but is not limited to, a scene cover image, a thumbnail, a scene name, a scene source, editing information (e.g., creation time, last editing time), other setting information (e.g., number of players supported), and the like. For example, the user associates game scene D1 with game scene D2 through the game scene association instruction; game scene D1 is a game stage of a racing playstyle with a city-street scene style, game scene D2 is a game stage of a survival playstyle with a mountain-forest scene style, and game scenes D1 and D2 are independent of each other. Of course, the disclosure is not limited thereto, and there may be some commonality between different game scenes.
In one embodiment, after associating at least two game scenes, index information of the at least two game scenes may be formed into a set. The index information of the game scene may be an identification (such as a name, a number, a file descriptor) of the game scene, storage address information (such as a path, a link, a storage position in a memory) of the game scene, and the like, and the complete game scene file may be found through the index information of the game scene. The index information of the game scenes is formed into a set, so that the association of the game scenes can be realized, and each game scene file can be searched and loaded through the set later.
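For illustration only, one way to keep such a set of index information is a list of identifier/storage-path entries from which the complete scene file can later be located and loaded. The field names and paths below are assumptions.

    scene_index = [
        {"scene_id": "AABB",   "path": "scenes/AABB.scene"},
        {"scene_id": "AABB-2", "path": "scenes/AABB-2.scene"},
    ]

    def locate_scene_file(scene_id: str) -> str:
        entry = next(e for e in scene_index if e["scene_id"] == scene_id)
        return entry["path"]                 # a real editor would open and parse this file

    print(locate_scene_file("AABB-2"))       # scenes/AABB-2.scene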
In one embodiment, the at least two game scenes include a first game scene and at least one second game scene; the game scene editing interface includes an editing interface of the first game scene. The associating at least two game scenes in response to the game scene association instruction in the game scene editing interface may include the steps of:
and responding to an instruction for adding the associated game scene for the first game scene in the editing interface of the first game scene, taking the first game scene as a main game scene, and taking the added second game scene as the associated game scene of the main game scene.
When editing the first game scene, the user may input, in the editing interface of the first game scene, an instruction to add an associated game scene to the first game scene; the second game scene is the game scene added by this instruction. By taking the first game scene as the main game scene and the second game scene as an associated game scene of the main game scene, a multi-scene association structure of "main game scene + associated game scene" is formed, from which a game scene combination is subsequently generated.
FIG. 3A shows a schematic diagram of an editing interface of the first game scene. In the editing interface, a "level" (layer) represents a game scene. The name of the first game scene is AABB, and the added second game scenes include AABB-2, AABB-3 and AABB-4. The game scene combination composed of the first game scene and the second game scenes has a multi-layer structure: the first game scene is the main game scene and occupies layer 1, while the second game scenes are associated game scenes occupying layer 2 and subsequent layers.
In one embodiment, the above instruction for adding an associated game scene to the first game scene may include: an instruction to create a new game scene as the associated game scene. For example, in FIG. 3A, when the user selects to add an associated game scene (i.e., clicks "+ increase the number of layers"), the interface of FIG. 3B is displayed; the user may then select to create a new game scene, thereby triggering generation of a new game scene that serves as an associated game scene of the first game scene. The new game scene may use a default game scene template (with default scene components, background, etc.), or may be a blank game scene or a copy of the first game scene. Through this operation mode, a game scene can be created quickly in the editing mode of the game scene combination, simplifying the user's operation.
In one embodiment, the game editing method may further include the steps of:
after a new game scene is generated by creating the game scene as an instruction of the associated game scene, generating an initial identification of the new game scene according to the identification of the first game scene and the sequence of the new game scene in the game scene combination.
The identification of the first game scene can be a number, a name, etc. The first game scene and its associated game scenes may have a certain order in the game scene combination, such as the layer numbers of the first game scene and the second game scenes shown in fig. 3A or fig. 3B. The order may be the order in which the game scenes appear at the game running stage. In one embodiment, the first game scene may default to the first scene in the game scene combination; in fig. 3A, the first game scene AABB defaults to layer 1. The order of the second game scenes may be determined according to the order in which they were added. For example, if a new game scene is first generated in the editing interface of the first game scene through the instruction of creating a game scene as the associated game scene, and is taken as an associated game scene of the first game scene, then this new game scene is the second scene in the game scene combination, i.e., its order (or layer number) is 2.
The identification of the first game scene may be combined with the order of the new game scene in the game scene combination. Illustratively, the identification of the first game scene is the scene name saved when the user edits the first game scene, for example AABB; if the new game scene is the Xth scene in the game scene combination, AABB and X are combined to obtain the initial identification of the new game scene, such as AABB-X or AABBX. An initial identification obtained in this way clearly shows the relationship between the new game scene and the first game scene, and is easy for users to understand.
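A possible naming rule matching the AABB-X example above is sketched below; the exact separator and format are an assumption.

    def initial_identifier(first_scene_name: str, order_in_combination: int) -> str:
        # combine the first scene's name with the new scene's order, e.g. "AABB" + 2 -> "AABB-2"
        return f"{first_scene_name}-{order_in_combination}"

    print(initial_identifier("AABB", 2))   # AABB-2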
After generating the initial identification of the game scenario, the user may be allowed to make modifications, such as the user may rename the new game scenario.
In one embodiment, the above instruction for adding an associated game scene to the first game scene may include: an instruction for selecting an associated game scene from among existing game scenes. For example, in fig. 3A, after the user selects to add an associated game scene (i.e., clicks "+ increase the number of layers"), the interface of fig. 3B is displayed; the user may then choose to add from existing game scenes, triggering a selection interface that displays the existing game scenes, from which the user may select one or more game scenes to be set as associated game scenes of the first game scene. Existing game scenes may include, but are not limited to: game scenes created by the user; game scenes created by others that the user has participated in editing; and openly editable game scenes provided by the game system, which may include the game's own built-in scenes, and the like. Through this operation mode, existing game scenes can be quickly reused in the editing mode of the game scene combination, simplifying the user's operation.
After the existing game scene is used as the associated game scene of the first game scene, the original identification of the existing game scene can be used, or a new identification can be generated for the existing game scene, for example, the new identification of the existing game scene can be generated according to the identification of the first game scene and the sequence of the new game scene in the game scene combination, or the user can rename the existing game scene.
In one embodiment, the game editing method may further include, before responding to the instruction to add the associated game scene to the first game scene in the editing interface of the first game scene, the steps of:
and providing an associated scene adding control in response to setting the playing method to be a preset playing method in the editing interface of the first game scene.
The preset playing method may be a playing method that supports consecutive levels, such as a PVE (Player vs. Environment) playing method. If the playing method is set to the preset playing method in the editing interface of the first game scene, this indicates that the currently set first game scene supports consecutive levels, so the editing mode of the game scene combination is allowed and the associated scene adding control is provided. Accordingly, the instruction for adding an associated game scene to the first game scene may be an instruction input through the associated scene adding control; that is, at this point adding an associated game scene to the first game scene is allowed. Conversely, if the playing method is set to a non-preset playing method in the editing interface of the first game scene, the currently set first game scene does not support consecutive levels, so the editing mode of the game scene combination cannot be adopted, no associated scene adding control is provided, and associated game scenes are not allowed to be added to the first game scene.
Fig. 4 shows a schematic diagram of setting playing methods in an editing interface of a first game scene, a user can select among running, PVE and other playing methods, if the PVE playing method is selected (i.e. the preset playing method is selected), an associated scene adding control is triggered to be displayed, namely a level number option box shown in fig. 4, and the user can click "+increase the level number" to trigger to add the associated game scene.
In one embodiment, after the associated scene adding control is provided or an associated game scene is added to the first game scene, the playing-method setting option in the editing interface of the first game scene may be locked so that the user can no longer change the playing method. This prevents the user from changing the playing method back to a non-preset playing method, which could cause system errors.
In one embodiment, after the first game scene is taken as the main game scene and the added second game scene is taken as the associated game scene of the main game scene, the index information of the associated game scene can be added into the data (such as attribute data, scene description data and the like) of the main game scene. When the main game scene is loaded, the file of the associated game scene can be searched through the index information of the associated game scene, and then the associated game scene is loaded, so that the association between the main game scene and the associated game scene is realized.
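A sketch of this embodiment follows: the main scene's data carries the index information of its associated scenes, and resolving that index at load time pulls in the associated scenes. The field names and the way files are resolved are assumptions.

    main_scene_data = {
        "scene_id": "AABB",
        "attributes": {"playstyle": "PVE"},
        "associated_scene_index": ["scenes/AABB-2.scene", "scenes/AABB-3.scene"],
    }

    def load_main_scene(data: dict) -> list:
        loaded = [data["scene_id"]]                       # load the main game scene itself
        for path in data["associated_scene_index"]:       # then resolve each associated scene file
            loaded.append(path.rsplit("/", 1)[-1].replace(".scene", ""))
        return loaded

    print(load_main_scene(main_scene_data))               # ['AABB', 'AABB-2', 'AABB-3']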
With continued reference to fig. 2, in step S230, in response to an editing instruction of a game scene to be edited, the game scene to be edited is edited according to the editing instruction; the game scene to be edited is a game scene of the above at least two game scenes.
The game scene to be edited is a game scene selected by a user for editing from the at least two game scenes, and can be any game scene. Editing instructions for a game scene to be edited may include, but are not limited to: adding scene components, deleting scene components, moving, scaling or rotating scene components, adjusting the properties of scene components, editing scene background and other instructions.
In one embodiment, the editing of the game scene according to the editing instruction in response to the editing instruction of the game scene may include the steps of:
responding to an editing triggering instruction of the game scene to be edited, and entering an editing interface of the game scene to be edited;
responding to an editing instruction in an editing interface of the game scene to be edited, and editing the game scene to be edited according to the editing instruction.
The game scene editing interface may provide an editing trigger control or option for each game scene. For example, an editing trigger control may be provided beside the icon or identifier of each game scene, such as the "edit" control on the right side of each game scene shown in fig. 3A or fig. 3B (since the first game scene AABB is in an editing state, its control is not displayed); when the user clicks the control of the game scene to be edited, entry into the editing interface of that game scene is triggered. Alternatively, the editing trigger instruction may not be generated by a control; for example, entering the editing interface of a new game scene may be triggered after the new game scene is created as an associated game scene, or entering the editing interface of an existing game scene may be triggered after it is selected as an associated game scene.
In the editing interface of the game scene to be edited, the user can perform one or more editing instructions, and the game program edits the game scene to be edited according to the editing instructions. After completing the editing of the game scene to be edited, the user may exit the editing interface of the game scene to be edited and return to the game scene editing interface in step S210.
Therefore, each game scene can be conveniently edited in the editing mode of the game scene combination, and the operation convenience and the editing efficiency are improved.
In one embodiment, the editing status of each game scene may be displayed in the game scene editing interface, such as "being edited" or "edited 5 minutes ago" shown in fig. 3A or 3B, to show the editing situation of each game scene more clearly. If the user clicks the editing trigger control of another game scene, for example the "edit" control of game scene AABB-2 in the interface of fig. 3A, a prompt such as "Game scene AABB is currently being edited. Exit editing and save AABB?" may be presented; if the user selects "yes", the editing state of game scene AABB is exited and the editing interface of game scene AABB-2 is entered.
The editing interface of the game scene to be edited may include a scene background and one or more generated scene components. The scene components may be scene components that the game scene to be edited comes with (for example, the game program may provide several different types of preset game scenes; when the user selects a preset game scene to edit, it initially contains built-in scene components) or scene components generated during editing. A scene component may be a character or object in the game, or a part of one, such as an NPC (Non-Player Character). After editing of the game scene to be edited is completed, when it is loaded and run as a game running scene, each scene component in it can generate a corresponding virtual scene model in the game running scene.
The scenario component may be set in the game scenario by a scenario component setting instruction. In one embodiment, where a game scene editing interface is displayed, one or more scene component selection controls may also be displayed in the graphical user interface, the scene component setting instructions being instructions implemented by the scene component selection controls. For example, the scene component selection controls may include controls of "tile component," "cylindrical component," "semi-cylindrical component," and the like, and responsive to user manipulation of the controls, corresponding scene components may be generated in the game scene, such as a user clicking on the "tile component" control, tile components may be generated in the game scene.
In one embodiment, the game program may come with one or more built-in scene components, such as scene components preconfigured by an artist and stored in the game program, and may provide corresponding scene component selection controls so that a player can conveniently use these scene components for game scene editing, for example adding a scene component with a single tap in the editing interface of the game scene to be edited.
In one embodiment, the scene components may be preconfigured by the player, who may obtain scene components that were originally absent in the game program by modeling in the game scene editing interface or other editing interfaces, or may combine the original scene components in the game program to form a combined scene component. For a player pre-configured scene component, a corresponding scene component selection control may also be provided. When the scene component is preconfigured, one or more of the information of the size, position, direction, color, texture, shape, etc. of the scene component can be configured. Therefore, when a user uses the scene components in an editing interface of a game scene to be edited, the configured information can be directly called, and the game scene editing method is very convenient and efficient. Of course, the user may also adjust the configured information in the scene component, such as adjusting one or more of the above information, so as to better meet the needs and preferences of the user.
In one embodiment, a virtual camera may be provided in the game scene to be edited. The virtual camera is a tool for simulating a real camera in a game program to shoot a game scene picture, can be arranged at any position in a game scene to be edited, shoots the game scene at any view angle, namely the virtual camera can have any pose in the game scene to be edited, and the pose can be fixed or dynamically changed. In addition, any number of virtual cameras can be arranged in the game scene to be edited, and different virtual cameras can shoot different game scene pictures.
The game scene to be edited can present different view angles, such as a viewing view angle and a game view angle, and a user can select which view angle to use in a relevant setting interface of the game scene to be edited. The viewing angle refers to the viewing angle of a third person to view a game scene to be edited, and under the viewing angle, a user may not manipulate a game character in the game scene to be edited, but directly manipulate a virtual camera to move the viewing angle (the virtual camera is not displayed). The game view angle refers to a view angle of a first person to observe a game scene to be edited, under the game view angle, a user can control a certain game role in the game scene to be edited, the game role can be bound with a virtual camera, namely, the position relationship between the game role and the virtual camera is fixed, for example, the game role can be positioned at the focus of the virtual camera, and when the user controls the game role to move, the virtual camera synchronously moves, so that the view angle is moved. Of course, under the observation view angle, invisible game characters can be set in the game scene to be edited, which is equivalent to hiding the game characters under the game view angle, and when the user moves the view angle, the user can move the virtual camera by moving the game characters. Under the observation view angle or the game view angle, a virtual rocker, an up-shift or down-shift control and the like can be arranged in the game scene to be edited, and a user can move the virtual camera or move the game character by operating the control.
With continued reference to FIG. 2, in step S240, a game scene combination is generated that includes at least two game scenes described above; the game scenario combination is configured to provide successive game stages during a game run phase, the successive game stages including a corresponding game stage for each game scenario.
The game scene combination can be generated after the at least two game scenes are associated. The game scene combination may also be generated in response to an edit completion instruction in the game scene editing interface, such as when the user clicks "exit and save" or "publish level" in the interface shown in fig. 3A or 3B.
The game scene combination provides successive game levels during the game running phase, each game level being generated from a corresponding game scene in the combination. For example, if a game scene combination includes game scenes D1 and D2, when a user selects this combination to play, the game first enters the game stage corresponding to D1; after that stage is cleared, the game stage corresponding to D2 is entered, realizing a game mode of successive game stages.
The game scene combination is a data structure associating multiple game scenes; compared with storing each game scene separately in full, it can reduce the amount of data as well as the storage and loading pressure.
In one embodiment, the game scene combination may be generated based on the set formed by the index information of each game scene and the peripheral information of the game scene combination. The peripheral information of the game scene composition may include, but is not limited to, a cover map, a thumbnail, a name, a scene source, editing information (e.g., creation time, last editing time), other setting information (e.g., number of players supported), etc. of the game scene composition. The set of index information of the game scene and the peripheral information of the game scene combination can be packaged into a file to obtain the file of the game scene combination.
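Purely as an illustration of this packaging, the sketch below serializes the ordered index set together with some peripheral information into a single combination file; the JSON layout and field names are assumptions.

    import json

    combination_file = json.dumps({
        "peripheral": {                                   # peripheral information of the combination
            "name": "AABB",
            "cover": "covers/aabb.png",
            "supported_players": 4,
            "created_at": "2023-10-11",
        },
        "scene_index": [                                  # ordered index set of the member scenes
            "scenes/AABB.scene", "scenes/AABB-2.scene", "scenes/AABB-3.scene",
        ],
    })
    print(json.loads(combination_file)["scene_index"][0])   # scenes/AABB.scene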
In one embodiment, after adding the index information of the associated game scene to the data of the main game scene, a game scene combination may be generated based on the data of the main game scene. Such as a complete data file of the main game scene may be used as a file of the game scene combination.
In one embodiment, the game scene combination has an order, representing the order among the game scenes therein. For example, in the above example, the order of the game scene D1 is 1, the order of the game scene D2 is 2, and if the game is played using the game scene combination, the game stage corresponding to the game scene D1 is entered first, and then the game stage corresponding to the game scene D2 is entered. If the order of the game scene D2 is 1 and the order of the game scene D1 is 2, the order of the corresponding two game stages should be reversed. The sequence may be embodied by a data structure in the game scene composition. For example, in the above set of index information of game scenes, the index information of each game scene may be arranged in order, for example, the set may be in a form of a queue, etc., and the arrangement order of the index information is the order of the game scenes in the game scene combination. Alternatively, in the data of the main game scene, the index information of each associated game scene may be arranged in order, that is, the order of the associated game scenes in the game scene combination (the order of the main game scene defaults to 1).
In one embodiment, icons or logos displaying the game scenes may be arranged in the game scene editing interface in the order of the game scenes in the game scene combination so that the user can clearly see the order of the game scenes.
In one embodiment, the game editing method may further include the steps of:
responding to the game running instruction, loading and displaying a current game level obtained by running a game scene in the game scene combination;
responding to a level settlement instruction, and performing settlement according to the game behavior of the virtual character in the current game level;
determining a target game scene according to the sequence in the game scene combination;
and loading and displaying the target game level obtained by running the target game scene.
The game running instruction may be an instruction to play a game using a game scene combination, for example an instruction issued when the user selects the game scene combination in a game preparation interface and clicks to start the game. When a game scene combination is selected for playing, which game scene's game stage to enter first can be determined according to preset logic, for example starting from the game stage corresponding to the 1st game scene in the order. After the current game level is cleared or a game ending condition is reached (for example, the set maximum game duration is reached), a level settlement instruction can be generated; alternatively, the user can input the level settlement instruction, for example by choosing to end the current game level and enter the next one. At this point settlement is carried out according to the game behavior, in the current game level, of the virtual character controlled by the user. When the next game level then needs to be entered, the target game scene can be determined according to the order in the game scene combination: if the game scene corresponding to the current game level is the 3rd game scene in the game scene combination, the 4th game scene is the target game scene, and the target game level corresponding to the target game scene is the next game level. The target game level obtained by running the target game scene is then loaded and displayed, and the virtual character enters the target game level to play.
According to the method, when the game level needs to be entered, the corresponding game scenes are loaded and operated, so that all the game scenes do not need to be loaded at one time, and the load of the client and the server can be reduced.
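The run-stage flow described above might look roughly like the sketch below, which loads only the current stage, settles it, and then picks the next scene by its order in the combination; class and method names are assumptions.

    from typing import List, Optional

    class CombinationRunner:
        def __init__(self, ordered_scene_ids: List[str]):
            self.order = ordered_scene_ids      # order of the scenes in the combination
            self.current = 0

        def load_current_stage(self) -> str:    # game running instruction: load only one scene
            return f"stage({self.order[self.current]}) loaded"

        def settle_and_advance(self, score: int) -> Optional[str]:   # level settlement instruction
            print(f"settled {self.order[self.current]} with score {score}")
            self.current += 1
            if self.current >= len(self.order):
                return None                     # no further scene: the combination is finished
            return self.load_current_stage()    # target stage from the next scene in the order

    runner = CombinationRunner(["D1", "D2"])
    print(runner.load_current_stage())           # stage(D1) loaded
    print(runner.settle_and_advance(score=120))  # stage(D2) loaded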
In one embodiment, the relevant information in the game scene combination may be loaded when the game scene combination is run, such as the set of index information of the game scenes, or the index information of the associated game scenes added to the main game scene. This information is loaded into memory or a cache, so that when switching between game stages (for example when the next game stage needs to be entered after the current one is settled), the target game scene can be determined from it, the file of the target game scene can be indexed, and that file can be loaded to obtain the target game stage.
In one embodiment, referring to fig. 5, the game editing method may further include the following steps S510 and S520:
step S510, when at least two game scenes are associated, determining the sequence of the game scenes in the game scene combination according to the adding sequence of the game scenes;
step S520, in response to the sequence editing operation for the game scene, the sequence of the game scene in the game scene combination is updated according to the sequence editing operation.
In this case, the addition order of the respective game scenes may be regarded as the order thereof in the game scene combination, for example, the order of the game scenes added first is 1, the order of the game scenes added thereafter is 2, and so on. In addition, the user may be allowed to perform a sequential editing operation for the game scene to change the sequence. For example, in the game scene editing interface, icons or marks of the game scenes are arranged and displayed in the order of the game scenes in the game scene combination, and the user can adjust the arrangement order, such as dragging an icon or mark of a certain game scene to move up and down, so as to update the order of the game scenes. This allows the user the flexibility to adjust the order of the game scenes to edit successive game levels as desired.
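A minimal sketch of steps S510 and S520 follows: the addition order gives the initial order, and a drag-style reorder updates it (the following embodiment additionally fixes the main game scene at position 1). Function names are assumptions.

    def add_scene(order: list, scene_id: str) -> list:     # S510: order follows the addition order
        return order + [scene_id]

    def move_scene(order: list, scene_id: str, new_index: int) -> list:   # S520: sequence editing
        order = [s for s in order if s != scene_id]
        order.insert(new_index, scene_id)
        return order

    order: list = []
    for scene_id in ("AABB", "AABB-2", "AABB-3"):
        order = add_scene(order, scene_id)
    print(move_scene(order, "AABB-3", 1))                   # ['AABB', 'AABB-3', 'AABB-2']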
In one embodiment, the order of the primary game scenes may be set to be fixed at 1 and not modifiable, with the order of their associated game scenes being arbitrarily adjustable. This ensures the stability of the structure of "main game scene + associated game scene" in the game scene combination.
In one embodiment, after associating at least two game scenes, the game editing method may further include the steps of:
The associated game scene icons and the order of the game scenes in the game scene combination are displayed in a preview interface of the game scenes.
The preview interface of a game scene can be a sub-interface of interfaces such as an editing overview interface or a game scene selection interface. The editing overview interface gives an overview of the game scenes that the user has created or participated in editing, and may include a preview interface for each such game scene, or display the preview interface of a game scene when the user selects it. The game scene selection interface is an interface for selecting a game scene to play, and may include a preview interface for each selectable game scene, or display the preview interface of a game scene when the user selects it. The preview interface of a game scene displays basic information of the game scene, including but not limited to its identification, icon, thumbnail, cover image, creator, editing time, release time, supported playing methods, supported number of players, and the like.
The associated game scene icon is used to indicate that the game scene is a game scene in a game scene combination, not a separate game scene. The order of the game scenes in the game scene combination may be shown in the form of numerals or the like. The related game scene icons and sequences are displayed in the preview interface, so that a user can intuitively and clearly see the related information of the game scene combination.
In one embodiment, for a primary game scene in a game scene combination, a primary game scene icon may be displayed, the order of which is typically 1 by default, so this information of order may not be displayed. For associated game scenes in a game scene combination, associated game scene icons and orders may be displayed. This facilitates the user in distinguishing between the primary game scenario and the associated game scenario.
FIG. 6 shows a schematic view of a preview interface of a game scene. The game scene AABB-3 is an associated game scene of the main game scene AABB, and the preview interface may display a thumbnail 601 of the game scene AABB-3, an associated game scene icon 602, sequence information 603 in the game scene combination, and game scene details (or peripheral information) 604. By associating the game scenario icon 602 with the sequence information 603, information of the game scenario AABB-3 in the game scenario combination can be displayed.
In one embodiment, the game editing method may further include the steps of:
providing a first number of functionality controls in a preview interface of a primary game scene;
providing a second number of functionality controls in a preview interface of the associated game scene; the first number is greater than the second number.
That is, more and richer functionality controls are provided in the preview interface of the main game scene, and only a portion of the functionality controls may be provided in the preview interface of the associated game scene, less than the functionality controls in the preview interface of the main game scene. Illustratively, an edit control is provided in the preview interface of the primary game scene through which a user can trigger an edit interface into the primary game scene or combination of game scenes; a play control may also be provided by which a user may trigger play using a game scene composition; a room list control may also be provided by which a user may create a game room using a game scene composition; a co-creation control may also be provided by which a user may invite other users to co-participate in editing of a primary game scene or combination of game scenes. In the preview interface of the associated game scene, an editing control is not provided, so that the editing interface of the associated game scene cannot be independently accessed, and editing of the associated game scene must be realized by triggering editing of the game scene combination; a play control may not be provided such that the associated game scene alone cannot be used for play; room list controls may not be provided such that game rooms cannot be created using the associated game scenes alone; the co-creation control may not be provided so that other users cannot be invited to participate in editing the associated game scene.
In one embodiment, collection controls, sharing controls, etc. may be provided in a preview interface of the associated game scene for implementing operational functions of collection, sharing, etc. of the associated game scene.
By limiting the functionality controls in the preview interface of an associated game scene, operations that are difficult to support on an associated game scene alone can be avoided, and the logic of the game scene combination is prevented from being affected. Moreover, reducing the number of functions opened for associated game scenes simplifies related interfaces such as the preview interface and reduces the load on the client or server.
In one embodiment, the loading and displaying the current game level obtained by running a game scene in the game scene combination in response to the game running instruction may include the following steps:
responding to a game running instruction, and determining game progress information according to a history clearance record of a game account number participating in a game in the game scene combination;
determining a current game scene in the game scene combination according to the game progress information;
and loading and displaying the current game level obtained by running the current game scene.
For example, suppose the game scene combination includes 10 game scenes D1 to D10 arranged in the order D1 to D10; when playing the game with this combination, the user should play the corresponding game levels in the order D1 to D10. If the history clearance record of the current game account in this game scene combination shows that game level D4 has been cleared, the game progress information is determined as game level D5, i.e. the game account can start the game from game level D5. When the game scene combination is entered for playing, game scene D5 can be determined as the current game scene, game scene D5 is run to obtain game level D5, and game level D5 is loaded and displayed as the current game level; that is, the game starts directly from game level D5.
In one embodiment, if the game scene combination is selected for a single-player game, the game progress information may be determined according to the history clearance record, in this game scene combination, of the game account participating in the game, and the current game scene may be determined according to the game progress information, so as to load and display the corresponding current game level. If the game scene combination is selected for a multiplayer game, the first game scene may be taken as the current game scene when the combination is entered, i.e. the game starts from the first game stage; alternatively, the game progress information may be determined according to the history clearance records, in this game scene combination, of the multiple game accounts participating in the multiplayer game, generally according to the lowest record. For example, if the lowest history clearance record among the accounts is game stage D3 (i.e., every account has cleared at least up to D3), the game progress information is determined as game stage D4, and game stage D4 is entered directly when the combination is entered.
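The progress rule in these embodiments could be sketched as follows, with the clearance record expressed as the number of stages already cleared per account and the multiplayer case taking the lowest record; the names and record format are assumptions.

    from typing import Dict, List

    def starting_stage(order: List[str], cleared_counts: Dict[str, int]) -> str:
        lowest_cleared = min(cleared_counts.values(), default=0)   # e.g. 4 means stages 1..4 cleared
        index = min(lowest_cleared, len(order) - 1)                # next uncleared stage, clamped
        return order[index]

    order = [f"D{i}" for i in range(1, 11)]
    print(starting_stage(order, {"account_a": 4}))                  # single player -> D5
    print(starting_stage(order, {"account_a": 6, "account_b": 3}))  # multiplayer, lowest record -> D4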
In one embodiment, if a game scene combination is selected for playing, the user may be allowed to change the game progress information according to the history clearance record of the game account in the game scene combination, as long as the progress does not exceed the history clearance record. For example, if the history record shows that game level D4 has been cleared, game levels D1 to D5 are currently selectable, and the user may choose any of them to start the game.
Fig. 7 shows a schematic diagram of setting game progress information. A certain game scene combination has 7 game scenes, which correspond to the layer 1 level, the layer 2 level, ..., and the layer 7 level, respectively, in the order of the game scenes in the combination. If the history clearance record of the game account currently participating in the game shows that the layer 4 level is the highest level cleared, the game progress can be selected among layers 1 to 5, while layers 6 and 7 remain locked (not yet unlocked).
In one embodiment, the game editing method may further include the steps of:
and responding to the issuing instruction aiming at the main game scene, issuing the main game scene and the associated game scene into a map pool of the server, so that the game client can download the main game scene and the associated game scene from the map pool.
The map pool of the server may be a pool of game maps open to players for browsing and downloading. When the user publishes the main game scene, the main game scene and the associated game scene are published together: either the file of the main game scene (which includes index information of the associated game scene) is uploaded to the map pool, or the files of the main game scene and the associated game scene are uploaded together. For example, the data file of the main game scene and the data file of the associated game scene may be packaged into a single file for the game scene combination and then uploaded to the map pool. Game clients (including those of other users) may download the files of the main game scene and the associated game scenes from the map pool, thereby enabling fast UGC (User Generated Content) sharing.
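As an illustration of the packaged-upload option, the sketch below bundles the data file of the main game scene, the data files of the associated game scenes, and an index into a single archive. The function names and archive layout are assumptions; the patent does not specify the server interface, so `upload_to_map_pool` is only a placeholder.

```python
# Hypothetical sketch: package a game scene combination and hand it to the map pool.
import json
import zipfile
from pathlib import Path

def package_scene_combination(main_scene: Path, associated_scenes: list[Path],
                              out_file: Path) -> Path:
    """Bundle the main scene file and its associated scene files into one archive."""
    with zipfile.ZipFile(out_file, "w") as bundle:
        bundle.write(main_scene, arcname=main_scene.name)
        for scene in associated_scenes:
            bundle.write(scene, arcname=scene.name)
        # Index so that a client can find the associated scenes of the main scene.
        index = {"main": main_scene.name,
                 "associated": [scene.name for scene in associated_scenes]}
        bundle.writestr("index.json", json.dumps(index))
    return out_file

def upload_to_map_pool(bundle: Path) -> None:
    """Placeholder: a real implementation would transfer the bundle to the server."""
    print(f"uploading {bundle} to the map pool")
```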
In one embodiment, the game editing method may further include the steps of:
in response to an unpublish instruction for the main game scene, requesting the server to remove the main game scene and the associated game scene from the map pool.
The unpublish instruction is the inverse of the publishing instruction; a main game scene can only be unpublished after it has been published. When the user unpublishes the main game scene, the main game scene and the associated game scene are unpublished together: the server is requested to remove the file of the main game scene from the map pool, or to remove the files of both the main game scene and the associated game scene, so that game clients can no longer download these map files from the map pool.
In one embodiment, responding to an editing instruction of the game scene to be edited and editing the game scene to be edited according to the editing instruction may include the following steps:
displaying a game scene to be edited in a graphical user interface;
setting a first functional component in a game scene to be edited;
in response to a first setting instruction for the first functional component, a designated game scene is determined from the associated at least two game scenes.
Accordingly, the game editing method may further include the steps of:
in a game running stage corresponding to a game scene to be edited, responding to the trigger of a first component object corresponding to a first functional component, and determining a designated game scene associated with the first functional component;
the virtual character is controlled to enter a designated game scene.
The first functional component is a scene component with a teleport function. In the game running stage, the first functional component generates a corresponding first component object, such as a teleport gate or a teleport point, which can teleport the virtual character.
When editing the game scene to be edited, the user can add the first functional component to the game scene, for example by placing it at any position through a selection control of the first functional component. The user may then input a first setting instruction for the first functional component, which adds a cross-scene teleport function to it. Through the first setting instruction, a designated game scene can be selected from the other game scenes in the game scene combination. For example, if the game scene to be edited is game scene D1, the user may select game scene D2 in the game scene combination as the designated game scene, thereby adding to the first functional component the function of teleporting to game scene D2 and establishing the association between the first functional component and the designated game scene.
In the game running stage, triggering the first component object corresponding to the first functional component determines the designated game scene associated with that component and causes the virtual character to enter it. Triggering the first component object may include, but is not limited to: the virtual character reaching or touching the first component object, or interacting with it using a particular prop or skill. The first component object may take a visible form, such as a teleport gate, or an invisible form, such as an unmarked area of the game scene that looks no different from an ordinary area to the user; entering such an area then triggers an unexpected jump to another game scene, which enriches gameplay. When the virtual character enters the designated game scene, a loading/wait screen may be displayed while the file of the designated game scene is loaded.
The first functional component thus enables jumps between different game scenes in the game scene combination. Compared with jumping to the next game level only after the current level has been settled, no settlement is required, and the jump need not follow the order of the game scene combination, which increases the freedom and variety of gameplay.
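A minimal sketch of this runtime behaviour is given below, under assumed class and field names; it models only the scene switch itself, not file loading or rendering.

```python
# Hypothetical sketch: triggering a first component object jumps straight to the
# designated game scene, without waiting for the current level to be settled.
from dataclasses import dataclass

@dataclass
class FirstComponentObject:
    designated_scene_id: str          # set by the first setting instruction

@dataclass
class Character:
    current_scene_id: str

def on_component_triggered(component: FirstComponentObject, character: Character) -> None:
    scene_id = component.designated_scene_id
    print(f"loading scene {scene_id} ...")   # a loading/wait screen would be shown here
    character.current_scene_id = scene_id

# Example: a teleport gate placed in scene D1 and associated with scene D2.
gate = FirstComponentObject(designated_scene_id="D2")
hero = Character(current_scene_id="D1")
on_component_triggered(gate, hero)
assert hero.current_scene_id == "D2"
```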
In one embodiment, the game editing method may further include the steps of:
and displaying, in a game running stage corresponding to the game scene to be edited, information of the designated game scene at the first component object corresponding to the first functional component.
The displayed information of the designated game scene may be its cover map, a thumbnail, a scene icon, a scene name, or the like, so that the user can intuitively see the teleport destination of the first component object. When this information is displayed, a certain transparency may be applied to prevent it from visually occluding the current game level (the level corresponding to the game scene to be edited).
In one embodiment, the information of the designated game scene may be displayed at the first component object corresponding to the first functional component only when a predetermined condition is satisfied. For example, the display is triggered when the virtual character enters a preset area centered on the first component object, or when any virtual character in the current game level triggers the first component object.
Fig. 8A shows a schematic diagram of displaying information of the designated game scene. In the game running stage corresponding to the game scene to be edited, a first component object 801 corresponding to the first functional component appears in the graphical user interface, and a cover map 802 of the designated game scene is displayed at the first component object 801; when the user controls the virtual character to approach the first component object 801, the cover map 802 can be seen.
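The "preset area centered on the first component object" condition can be expressed as a simple distance check, as in the sketch below; the radius value is an arbitrary assumption.

```python
# Hypothetical sketch: show the designated scene's cover map only when the
# virtual character is within a preset radius of the first component object.
import math

def should_show_scene_info(character_pos: tuple[float, float],
                           component_pos: tuple[float, float],
                           radius: float = 5.0) -> bool:
    dx = character_pos[0] - component_pos[0]
    dy = character_pos[1] - component_pos[1]
    return math.hypot(dx, dy) <= radius

assert should_show_scene_info((2.0, 1.0), (0.0, 0.0))
assert not should_show_scene_info((10.0, 0.0), (0.0, 0.0))
```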
In one embodiment, the controlling the virtual character to enter the designated game scene may include the steps of:
displaying, in the graphical user interface, a preparation screen for jumping to the designated game scene; the preparation screen comprises a game mode selection control;
and responding to an operation on the game mode selection control, controlling the virtual character to enter the designated game scene according to the selected game mode.
The game mode selection control may be used to select a game mode, such as a single-player game mode, a multiplayer game mode, a racing mode, a survival mode, and the like. The user may select the mode they want to play through the game mode selection control.
In one embodiment, if the single-player game mode is selected, the designated game scene may be loaded directly and the virtual character controlled to enter it.
In one embodiment, if the multiplayer game mode is selected, the controlling the virtual character to enter the designated game scene according to the selected game mode may include the following steps:
and adding the user information corresponding to the virtual character to a matching queue of the designated game scene for matching, and controlling the virtual character to enter the designated game scene when matching succeeds.
The game level corresponding to the designated game scene may be a multiplayer game level; when the virtual character jumps to the designated game scene by triggering the first component object, it can enter the matching queue of the designated game scene. Based on the user information, the system can match the user with other players of a similar game grade or from a similar game region into the same game session, and after matching succeeds, the virtual characters are controlled to enter the designated game scene.
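A minimal sketch of such a matching queue is shown below. Bucketing players by scene and game grade is an illustrative assumption; the patent only states that players with a similar grade or region are matched into the same session.

```python
# Hypothetical sketch of a per-scene matching queue.
from collections import defaultdict

class MatchQueue:
    def __init__(self, party_size: int = 4) -> None:
        self.party_size = party_size
        # One queue per (scene, grade) bucket.
        self.queues: dict[tuple[str, int], list[str]] = defaultdict(list)

    def enqueue(self, scene_id: str, user_id: str, grade: int) -> list[str] | None:
        """Queue a user for a scene; return the matched party once it is full."""
        key = (scene_id, grade)
        self.queues[key].append(user_id)
        if len(self.queues[key]) >= self.party_size:
            party = self.queues[key][:self.party_size]
            self.queues[key] = self.queues[key][self.party_size:]
            return party          # every member of the party now enters the scene
        return None

queue = MatchQueue(party_size=2)
assert queue.enqueue("D2", "user_a", grade=3) is None
assert queue.enqueue("D2", "user_b", grade=3) == ["user_a", "user_b"]
```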
In addition to the game mode selection control, other information may be displayed in the preparation screen, including but not limited to: details (or peripheral information) of the designated game scene, such as scene name, gameplay tags, creator information, scene description, and play count; the cover map of the designated game scene; waiting time or progress bar information (e.g., the loading progress of the designated game scene); and game matching information, such as the matched users in the multiplayer game mode.
In one embodiment, the preparation screen may be displayed as a sub-interface (e.g., a floating window or overlay) within the interface of the current game level. While the preparation screen is displayed, the user may still be allowed to control the virtual character; for example, moving the virtual character away from the first component object is equivalent to canceling the trigger, in which case the preparation screen may be closed and the user returns to the current game level without jumping to the designated game scene.
Fig. 8B shows a schematic diagram of displaying the preparation screen. When the virtual character enters the trigger range of the first component object 801, the first component object 801 is triggered, and a preparation screen may be displayed that includes the cover map of the designated game scene (which may be larger than the cover map 802 in Fig. 8A), details of the designated game scene, and a game mode selection control 803 through which the user can choose the single-player or multiplayer game mode. If the multiplayer game mode is selected, other players need to be matched, and text such as "waiting for other warriors to join" may be displayed in the preparation screen.
In one embodiment, information of the virtual character, such as its life value, equipment, and prop information, may be saved before the virtual character is controlled to enter the designated game scene and applied again afterwards, so that the virtual character inherits the life value, equipment, and prop information it had in the previous level.
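A minimal sketch of this save-and-inherit step, with assumed field names, is shown below.

```python
# Hypothetical sketch: save the character's state before the jump and reapply it
# after the designated scene has been loaded, so the state carries over.
from dataclasses import dataclass, field

@dataclass
class CharacterState:
    life: int = 100
    equipment: list[str] = field(default_factory=list)
    props: dict[str, int] = field(default_factory=dict)

def transfer_with_inheritance(state: CharacterState, load_scene) -> CharacterState:
    saved = CharacterState(state.life, list(state.equipment), dict(state.props))
    load_scene()          # load and display the designated game scene
    return saved          # state to reapply to the character in the new scene

state = CharacterState(life=42, equipment=["sword"], props={"potion": 3})
carried = transfer_with_inheritance(state, load_scene=lambda: None)
assert carried.life == 42 and carried.props["potion"] == 3
```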
In one embodiment, responding to an editing instruction of the game scene to be edited and editing the game scene to be edited according to the editing instruction may include the following steps:
displaying a game scene to be edited in a graphical user interface;
setting a second functional component in the game scene to be edited;
responding to a second setting instruction aiming at the second functional component, and determining a target position according to the second setting instruction, wherein the target position is a position in a game scene to be edited;
accordingly, the game editing method may further include the steps of:
and in a game running stage corresponding to the game scene to be edited, responding to the trigger of the second component object corresponding to the second functional component, and controlling the virtual character to move from the current position to the target position associated with the second functional component.
The second functional component is also a scene component with a teleport function. In the game running stage, the second functional component generates a corresponding second component object, such as a teleport gate or a teleport point, which can teleport the virtual character.
When editing the game scene to be edited, the user can add the second functional component to the game scene, for example by placing it at any position through a selection control of the second functional component. The user may then input a second setting instruction for the second functional component, which adds an in-scene teleport function to it. Through the second setting instruction, a target position can be determined in the game scene to be edited, and the function of teleporting to that target position is added to the second functional component, establishing the association between the second functional component and the target position.
In the game running stage, triggering the second component object corresponding to the second functional component determines the target position associated with that component and teleports the virtual character to it. Triggering the second component object may include, but is not limited to: the virtual character reaching or touching the second component object, or interacting with it using a particular prop or skill. The second component object may take a visible form, such as a teleport gate, or an invisible form, such as an unmarked area of the game scene that looks no different from an ordinary area to the user; entering such an area then triggers an unexpected teleport, which enriches gameplay.
In one embodiment, the first functional component and the second functional component may be the same functional component, for example both being teleport gate components. After adding such a component to the game scene to be edited, the user can configure it, through the first setting instruction or the second setting instruction, for either cross-scene or in-scene teleporting. The user sets a teleport destination for the component: if "another level" is selected, a level selection box appears listing the other game scenes in the game scene combination, and the user selects the designated game scene from them, completing the first setting instruction; if "this level" is selected, a target position may be set in the game scene to be edited, or another functional component (such as another teleport gate) in the scene may be set as the target position, completing the second setting instruction.
In one embodiment, the first functional component and the second functional component may be different functional components; for example, the first functional component may be a (cross-level) teleport array component and the second functional component a (within-level) teleport gate component. When the user places a teleport array component in the game scene to be edited, only a cross-scene teleport function can be configured for it; when the user places a teleport gate component, only an in-scene teleport function can be configured for it.
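Both configurations can be illustrated with a single teleport component whose destination is either another game scene (first setting instruction) or a position in the current scene (second setting instruction). The names in the sketch below are assumptions for illustration.

```python
# Hypothetical sketch: one teleport component, two kinds of destination.
from dataclasses import dataclass

@dataclass
class Character:
    current_scene_id: str
    position: tuple[float, float]

@dataclass
class TeleportComponent:
    target_scene_id: str | None = None                   # first setting instruction
    target_position: tuple[float, float] | None = None   # second setting instruction

def apply_trigger(component: TeleportComponent, character: Character) -> None:
    if component.target_scene_id is not None:
        character.current_scene_id = component.target_scene_id   # cross-scene jump
    elif component.target_position is not None:
        character.position = component.target_position           # in-scene teleport

hero = Character(current_scene_id="D1", position=(0.0, 0.0))
apply_trigger(TeleportComponent(target_position=(25.0, 8.0)), hero)
assert hero.position == (25.0, 8.0)
apply_trigger(TeleportComponent(target_scene_id="D2"), hero)
assert hero.current_scene_id == "D2"
```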
The exemplary embodiments of the present disclosure also provide a game editing device. Referring to fig. 9, the game editing device 900 may include the following program modules:
an editing interface display module 910 configured to display a game scene editing interface in a graphical user interface provided by running a game program;
a game scene association module 920 configured to associate at least two game scenes in response to a game scene association instruction in the game scene editing interface, wherein the at least two game scenes are independent game scenes;
a game scene editing module 930 configured to respond to an editing instruction of a game scene to be edited, and edit the game scene to be edited according to the editing instruction; the game scene to be edited is one of the at least two game scenes;
a game scene combination generation module 940 configured to generate a game scene combination including the at least two game scenes; the game scene combination is configured to provide successive game levels in a game running stage, the successive game levels including a game level corresponding to each of the game scenes.
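For illustration only, the sketch below shows one possible way to wire the four program modules together in code; the method names are assumptions and do not come from the patent.

```python
# Hypothetical structural sketch of the game editing device and its modules.
class GameEditingDevice:
    def __init__(self, interface_display, scene_association, scene_editing, combination_generation):
        self.interface_display = interface_display              # module 910
        self.scene_association = scene_association              # module 920
        self.scene_editing = scene_editing                       # module 930
        self.combination_generation = combination_generation     # module 940

    def handle_association_instruction(self, instruction):
        self.interface_display.show_editing_interface()
        return self.scene_association.associate(instruction)

    def handle_edit_instruction(self, scene, instruction):
        self.scene_editing.edit(scene, instruction)

    def build_combination(self, scenes):
        return self.combination_generation.generate(scenes)
```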
In one embodiment, the at least two game scenes include a first game scene and at least one second game scene; the game scene editing interface comprises an editing interface of a first game scene; and in response to the game scene association instruction in the game scene editing interface, associating at least two game scenes, including:
and responding to an instruction for adding the associated game scene for the first game scene in the editing interface of the first game scene, taking the first game scene as a main game scene, and taking the added second game scene as the associated game scene of the main game scene.
In one embodiment, the above instruction for adding an associated game scene to the first game scene includes: an instruction to create a new game scene as the associated game scene.
In one implementation, the game scenario association module 920 is further configured to:
After a new game scene is generated in response to the instruction to create a new game scene as the associated game scene, generating an initial identification of the new game scene according to the identification of the first game scene and the order of the new game scene in the game scene combination.
In one embodiment, the above instruction for adding an associated game scene to the first game scene includes: instructions for selecting an associated game scene from among the existing game scenes.
In one implementation, the game scenario association module 920 is further configured to:
before responding to the instruction for adding the associated game scene to the first game scene in the editing interface of the first game scene, providing an associated scene adding control in response to the playing method being set to a preset playing method in the editing interface of the first game scene;
the instruction for adding the associated game scene for the first game scene is an instruction input through an associated scene adding control.
In one implementation, the game scenario association module 920 is further configured to:
providing a first number of functionality controls in a preview interface of a primary game scene;
providing a second number of functionality controls in a preview interface of the associated game scene; the first number is greater than the second number.
In one embodiment, the game scene editing module 930 is further configured to:
in response to a publishing instruction for the main game scene, publishing the main game scene and the associated game scene to a map pool of the server, so that the game client can download the main game scene and the associated game scene from the map pool.
In one embodiment, the game scene editing module 930 is further configured to:
in response to an unpublish instruction for the main game scene, requesting the server to remove the main game scene and the associated game scene from the map pool.
In one embodiment, the responding to the editing instruction of the game scene to be edited, editing the game scene to be edited according to the editing instruction, includes:
responding to an editing triggering instruction of the game scene to be edited, and entering an editing interface of the game scene to be edited;
responding to an editing instruction in an editing interface of the game scene to be edited, and editing the game scene to be edited according to the editing instruction.
In one embodiment, the game editing device 900 may further include a game execution processing module configured to:
responding to a game running instruction, loading and displaying a current game level obtained by running a game scene in the game scene combination;
responding to a level settlement instruction, and performing settlement according to the game behaviors of the virtual character in the current game level;
determining a target game scene according to the sequence in the game scene combination;
and loading and displaying the target game level obtained by running the target game scene.
In one implementation, the game scenario association module 920 is further configured to:
when at least two game scenes are associated, determining the sequence of the game scenes in the game scene combination according to the adding sequence of the game scenes;
the game scene editing module 930 is further configured to:
in response to a sequence editing operation for the game scene, the sequence of the game scene in the game scene combination is updated according to the sequence editing operation.
In one implementation, the game scenario association module 920 is further configured to: after associating at least two game scenes, associated game scene icons and the order of the game scenes in the game scene combination are displayed in a preview interface of the game scenes.
In one embodiment, the game execution processing module is further configured to:
responding to a game running instruction, and determining game progress information according to a history clearance record of a game account number participating in a game scene combination;
Determining a current game scene in the game scene combination according to the game progress information;
and loading and displaying the current game level obtained by running the current game scene.
In one embodiment, the responding to the editing instruction of the game scene to be edited, editing the game scene to be edited according to the editing instruction, includes:
displaying a game scene to be edited in a graphical user interface;
setting a first functional component in a game scene to be edited;
determining a designated game scene from the associated at least two game scenes in response to a first setting instruction for the first functional component;
the game editing device 900 may further include a game execution processing module configured to:
in a game running stage corresponding to a game scene to be edited, responding to the trigger of a first component object corresponding to a first functional component, and determining a designated game scene associated with the first functional component;
the virtual character is controlled to enter a designated game scene.
In one embodiment, the responding to the editing instruction of the game scene to be edited, editing the game scene to be edited according to the editing instruction, includes:
displaying a game scene to be edited in a graphical user interface;
setting a second functional component in the game scene to be edited;
Responding to a second setting instruction aiming at the second functional component, and determining a target position according to the second setting instruction, wherein the target position is a position in a game scene to be edited;
the game editing device 900 may further include a game execution processing module configured to:
and in a game running stage corresponding to the game scene to be edited, responding to the trigger of the second component object corresponding to the second functional component, and controlling the virtual character to move from the current position to the target position associated with the second functional component.
The specific details of each part of the above apparatus have been described in the method embodiments; for details not disclosed here, reference may be made to the method embodiments, which are not repeated.
Exemplary embodiments of the present disclosure also provide a computer readable storage medium, which may be implemented in the form of a program product comprising program code for causing an electronic device to carry out the steps according to the various exemplary embodiments of the disclosure as described in the above section of the "exemplary method" when the program product is run on the electronic device. In an alternative embodiment, the program product may be implemented as a portable compact disc read only memory (CD-ROM) and comprises program code and may run on an electronic device, such as a personal computer. However, the program product of the present disclosure is not limited thereto, and in this document, a readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device.
The program product may employ any combination of one or more readable media. The readable medium may be a readable signal medium or a readable storage medium. The readable storage medium can be, for example, but is not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or a combination of any of the foregoing. More specific examples (a non-exhaustive list) of the readable storage medium would include the following: an electrical connection having one or more wires, a portable disk, a hard disk, random access memory (RAM), read-only memory (ROM), erasable programmable read-only memory (EPROM or flash memory), optical fiber, portable compact disk read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing.
The computer readable signal medium may include a data signal propagated in baseband or as part of a carrier wave with readable program code embodied therein. Such a propagated data signal may take any of a variety of forms, including, but not limited to, electro-magnetic, optical, or any suitable combination of the foregoing. A readable signal medium may also be any readable medium that is not a readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device.
Program code embodied on a readable medium may be transmitted using any appropriate medium, including but not limited to wireless, wireline, optical fiber cable, RF, etc., or any suitable combination of the foregoing.
Program code for carrying out operations of the present disclosure may be written in any combination of one or more programming languages, including an object oriented programming language such as Java, C++ or the like and conventional procedural programming languages, such as the "C" programming language or similar programming languages. The program code may execute entirely on the user's computing device, partly on the user's device, as a stand-alone software package, partly on the user's computing device, partly on a remote computing device, or entirely on the remote computing device or server. In the case of remote computing devices, the remote computing device may be connected to the user computing device through any kind of network, including a Local Area Network (LAN) or a Wide Area Network (WAN), or may be connected to an external computing device (e.g., connected via the Internet using an Internet service provider).
Exemplary embodiments of the present disclosure also provide an electronic device, such as may be the client 110 or the server 120 described above. The electronic device may include a processor and a memory. The memory stores executable instructions of the processor, such as program code. The processor performs the method of the present exemplary embodiment by executing the executable instructions. The electronic device may further comprise a display for displaying the graphical user interface.
An electronic device is illustrated in the form of a general purpose computing device with reference to fig. 10. It should be understood that the electronic device 1000 shown in fig. 10 is merely an example and should not be construed as limiting the functionality and scope of use of embodiments of the present disclosure.
As shown in fig. 10, the electronic device 1000 may include: processor 1010, memory 1020, bus 1030, I/O (input/output) interface 1040, network adapter 1050, and display 1060.
Memory 1020 may include volatile memory such as RAM 1021, cache unit 1022, and may also include nonvolatile memory such as ROM 1023. Memory 1020 may also include one or more program modules 1024, such program modules 1024 including, but not limited to: an operating system, one or more application programs, other program modules, and program data, each or some combination of which may include an implementation of a network environment. For example, program modules 1024 may include modules in the apparatus described above.
The processor 1010 may include one or more processing units, for example: an AP (Application Processor), a modem processor, a GPU (Graphics Processing Unit), an ISP (Image Signal Processor), a controller, an encoder, a decoder, a DSP (Digital Signal Processor), a baseband processor, and/or an NPU (Neural-Network Processing Unit), and the like.
The processor 1010 may be used to execute executable instructions stored in the memory 1020, such as may perform any one or more of the method steps in the present exemplary embodiment.
The bus 1030 is used to enable connections between the various components of the electronic device 1000 and may include a data bus, an address bus, and a control bus.
The electronic device 1000 can communicate with one or more external devices 1100 (e.g., keyboard, mouse, external controller, etc.) through the I/O interface 1040.
Electronic device 1000 can communicate with one or more networks through network adapter 1050, e.g., network adapter 1050 can provide a mobile communication solution such as 3G/4G/5G, or a wireless communication solution such as wireless local area network, bluetooth, near field communication, etc. Network adapter 1050 can communicate with other modules of electronic device 1000 via bus 1030.
The electronic device 1000 may display a graphical user interface, such as a game scene editing interface, etc., via the display 1060.
Although not shown in fig. 10, other hardware and/or software modules may also be provided in the electronic device 1000, including, but not limited to: microcode, device drivers, redundant processors, external disk drive arrays, RAID systems, tape drives, data backup storage systems, and the like.
It should be noted that although in the above detailed description several modules or units of a device for action execution are mentioned, such a division is not mandatory. Indeed, the features and functionality of two or more modules or units described above may be embodied in one module or unit in accordance with exemplary embodiments of the present disclosure. Conversely, the features and functions of one module or unit described above may be further divided into a plurality of modules or units to be embodied.
Those skilled in the art will appreciate that the various aspects of the present disclosure may be implemented as a system, method, or program product. Accordingly, various aspects of the disclosure may be embodied in the following forms, namely: an entirely hardware embodiment, an entirely software embodiment (including firmware, micro-code, etc.), or an embodiment combining hardware and software aspects, which may be referred to herein as a "circuit," "module," or "system." Other embodiments of the disclosure will be apparent to those skilled in the art from consideration of the specification and practice of the disclosure disclosed herein. This application is intended to cover any variations, uses, or adaptations of the disclosure following, in general, the principles of the disclosure and including such departures from the present disclosure as come within known or customary practice within the art to which the disclosure pertains. It is intended that the specification and examples be considered as exemplary only, with a true scope and spirit of the disclosure being indicated by the following claims.
It is to be understood that the present disclosure is not limited to the precise arrangements and instrumentalities shown in the drawings, and that various modifications and changes may be effected without departing from the scope thereof. The scope of the present disclosure is limited only by the appended claims.

Claims (19)

1. A game editing method, the method comprising:
displaying a game scene editing interface in a graphical user interface provided by running a game program;
responding to game scene association instructions in the game scene editing interface, and associating at least two game scenes, wherein the at least two game scenes are independent game scenes;
responding to an editing instruction of a game scene to be edited, and editing the game scene to be edited according to the editing instruction; the game scene to be edited is a game scene in the at least two game scenes;
generating a game scene combination comprising the at least two game scenes; the game scene combination is configured to provide successive game levels in a game running stage, the successive game levels including a game level corresponding to each of the game scenes.
2. The method of claim 1, wherein the at least two game scenes comprise a first game scene and at least one second game scene; the game scene editing interface comprises an editing interface of the first game scene; the responding to the game scene association instruction in the game scene editing interface associates at least two game scenes, and the method comprises the following steps:
And responding to an instruction for adding an associated game scene for the first game scene in an editing interface of the first game scene, taking the first game scene as a main game scene, and taking the added second game scene as the associated game scene of the main game scene.
3. The method of claim 2, wherein the instructions to add an associated game scene to the first game scene comprise: an instruction to create a new game scene as the associated game scene.
4. A method according to claim 3, characterized in that the method further comprises:
after a new game scene is generated in response to the instruction to create a new game scene as the associated game scene, generating an initial identification of the new game scene according to the identification of the first game scene and the order of the new game scene in the game scene combination.
5. The method of claim 2, wherein the instructions to add an associated game scene to the first game scene comprise: instructions for selecting an associated game scene from among the existing game scenes.
6. The method of claim 2, wherein prior to responding to the instruction to add an associated game scene for the first game scene in the editing interface of the first game scene, the method further comprises:
responding to the playing method being set to a preset playing method in the editing interface of the first game scene, providing an associated scene adding control;
the instruction for adding the associated game scene for the first game scene is an instruction input through the associated scene adding control.
7. The method according to claim 2, wherein the method further comprises:
providing a first number of functionality controls in a preview interface of the primary game scene;
providing a second number of functionality controls in a preview interface of the associated game scene; the first number is greater than the second number.
8. The method according to claim 2, wherein the method further comprises:
and responding to a publishing instruction for the main game scene, publishing the main game scene and the associated game scene to a map pool of a server, so that a game client can download the main game scene and the associated game scene from the map pool.
9. The method of claim 8, wherein the method further comprises:
and responding to an unpublish instruction for the main game scene, requesting the server to remove the main game scene and the associated game scene from the map pool.
10. The method of claim 1, wherein the responding to the editing instruction of the game scene to be edited and editing the game scene to be edited according to the editing instruction comprises:
responding to an editing triggering instruction of the game scene to be edited, and entering an editing interface of the game scene to be edited;
responding to an editing instruction in an editing interface of the game scene to be edited, and editing the game scene to be edited according to the editing instruction.
11. The method according to claim 1, wherein the method further comprises:
responding to a game running instruction, loading and displaying a current game level obtained by running a game scene in the game scene combination;
responding to a level settlement instruction, and performing settlement according to the game behaviors of the virtual character in the current game level;
determining a target game scene according to the sequence in the game scene combination;
and loading and displaying a target game level obtained by running the target game scene.
12. The method of claim 11, wherein the method further comprises:
determining the sequence of the game scenes in the game scene combination according to the adding sequence of the game scenes when the at least two game scenes are associated;
And in response to a sequence editing operation for the game scene, updating the sequence of the game scene in the game scene combination according to the sequence editing operation.
13. The method of claim 11, wherein after associating at least two game scenes, the method further comprises:
and displaying associated game scene icons and the sequence of the game scenes in the game scene combination in a preview interface of the game scenes.
14. The method of claim 11, wherein loading and displaying a current game level obtained by running a game scene of the combination of game scenes in response to a game running instruction comprises:
responding to a game running instruction, and determining game progress information according to a history clearance record of a game account number participating in a game in the game scene combination;
determining a current game scene in the game scene combination according to the game progress information;
and loading and displaying the current game level obtained by running the current game scene.
15. The method of claim 1, wherein the responding to the editing instruction of the game scene to be edited and editing the game scene to be edited according to the editing instruction comprises:
Displaying the game scene to be edited in the graphical user interface;
setting a first functional component in the game scene to be edited;
determining a designated game scene from the associated at least two game scenes in response to a first setting instruction for the first functional component;
the method further comprises the steps of:
in a game running stage corresponding to the game scene to be edited, responding to the trigger of a first component object corresponding to the first functional component, and determining the designated game scene associated with the first functional component;
and controlling the virtual character to enter the appointed game scene.
16. The method of claim 1, wherein the responding to the editing instruction of the game scene to be edited and editing the game scene to be edited according to the editing instruction comprises:
displaying the game scene to be edited in the graphical user interface;
setting a second functional component in the game scene to be edited;
responding to a second setting instruction aiming at the second functional component, and determining a target position according to the second setting instruction, wherein the target position is the position in the game scene to be edited;
The method further comprises the steps of:
and in a game running stage corresponding to the game scene to be edited, responding to the trigger aiming at the second component object corresponding to the second functional component, and controlling the virtual character to move from the current position to the target position associated with the second functional component.
17. A game editing device, the device comprising:
the editing interface display module is configured to display a game scene editing interface in a graphical user interface provided by running a game program;
a game scene association module configured to associate at least two game scenes in response to a game scene association instruction in the game scene editing interface, wherein the game scenes are independent of each other;
the game scene editing module is configured to respond to an editing instruction of a game scene to be edited, and edit the game scene to be edited according to the editing instruction; the game scene to be edited is a game scene in the at least two game scenes;
a game scene combination generation module configured to generate a game scene combination including the at least two game scenes; the game scene combination is configured to provide successive game levels in a game running stage, the successive game levels including a game level corresponding to each of the game scenes.
18. A computer readable storage medium, on which a computer program is stored, characterized in that the computer program, when being executed by a processor, implements the method of any one of claims 1 to 16.
19. An electronic device, comprising:
a processor;
a memory for storing executable instructions of the processor;
wherein the processor is configured to perform the method of any one of claims 1 to 16 via execution of the executable instructions.
CN202311315514.1A 2023-10-10 2023-10-10 Game editing method, game editing device, storage medium and electronic apparatus Pending CN117282106A (en)

Priority Applications (1)

Application number: CN202311315514.1A | Priority date: 2023-10-10 | Filing date: 2023-10-10 | Title: Game editing method, game editing device, storage medium and electronic apparatus

Applications Claiming Priority (1)

Application number: CN202311315514.1A (publication CN117282106A) | Priority date: 2023-10-10 | Filing date: 2023-10-10 | Title: Game editing method, game editing device, storage medium and electronic apparatus

Publications (1)

Publication number: CN117282106A | Publication date: 2023-12-26

Family ID: 89251622

Family Applications (1)

Application number: CN202311315514.1A (publication CN117282106A) | Status: Pending | Priority date: 2023-10-10 | Filing date: 2023-10-10 | Title: Game editing method, game editing device, storage medium and electronic apparatus

Country Status (1)

Country: CN | Link: CN117282106A (en)

Similar Documents

Publication Publication Date Title
CN111294663B (en) Bullet screen processing method and device, electronic equipment and computer readable storage medium
US7904812B2 (en) Browseable narrative architecture system and method
US7303476B2 (en) Method and apparatus for creating and playing soundtracks in a gaming system
US20090150760A1 (en) Creating publications using game-based media content
TW201005583A (en) Interactive systems and methods for video compositing
JP2016189804A (en) Server system
US20150026573A1 (en) Media Editing and Playing System and Method Thereof
KR20060030036A (en) System and method of interactive video playback
US20200402537A1 (en) Method and device of presenting audio/video files, computing device, and readable storage medium
US20200306651A1 (en) Server system and play data community system
de Lima et al. Video-based interactive storytelling using real-time video compositing techniques
JP7328459B2 (en) Game console application with action card strands
CN117085335A (en) Game editing method, game editing device, storage medium and electronic apparatus
US20040139481A1 (en) Browseable narrative architecture system and method
US20230350554A1 (en) Position marking method, apparatus, and device in virtual scene, storage medium, and program product
KR20110074191A (en) Apparatus and method for issuing quest
CN117282106A (en) Game editing method, game editing device, storage medium and electronic apparatus
US20210252412A1 (en) Video game page providing information and functionalities based on video game lifecycle and user context
JP5700521B2 (en) Execution file to create a video work file by editing a video of your choice while watching a template video on the user's computer, and how to use it
Stuckey Special effects and spectacle: integration of CGI in contemporary Chinese film
CN112789090A (en) Data management and performance tracking system for walkable or interactive virtual reality
CN117101147A (en) Game editing method, game editing device, storage medium and electronic apparatus
CN115396685B (en) Live interaction method and device, readable storage medium and electronic equipment
CN117258302A (en) Game editing method, game editing device, storage medium and electronic apparatus
CN117717785A (en) Game scene component method, device, storage medium and electronic equipment

Legal Events

Code: PB01 | Title: Publication
Code: SE01 | Title: Entry into force of request for substantive examination