CN111124402B - Method, device, terminal equipment and storage medium for editing multimedia in game - Google Patents


Info

Publication number
CN111124402B
Authority
CN
China
Prior art keywords
multimedia, game, selection control, editing, interface
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201911114316.2A
Other languages
Chinese (zh)
Other versions
CN111124402A (en)
Inventor
罗青
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Netease Hangzhou Network Co Ltd
Original Assignee
Netease Hangzhou Network Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Netease Hangzhou Network Co Ltd filed Critical Netease Hangzhou Network Co Ltd
Priority to CN201911114316.2A priority Critical patent/CN111124402B/en
Publication of CN111124402A publication Critical patent/CN111124402A/en
Application granted granted Critical
Publication of CN111124402B publication Critical patent/CN111124402B/en


Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 8/00: Arrangements for software engineering
    • G06F 8/30: Creation or generation of source code
    • G06F 8/38: Creation or generation of source code for implementing user interfaces
    • A: HUMAN NECESSITIES
    • A63: SPORTS; GAMES; AMUSEMENTS
    • A63F: CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F 13/00: Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F 13/60: Generating or modifying game content before or while executing the game program, e.g. authoring tools specially adapted for game development or game-integrated level editor

Abstract

The application provides a method, an apparatus, a terminal device, and a storage medium for editing multimedia in a game, and relates to the technical field of software games. According to embodiments of the application, a multimedia editing interface can be provided on a graphical user interface, and the multimedia editing interface is controlled to display at least one element selection control, where different element selection controls are associated with different time frames. In response to a selection operation on an element selection control, the multimedia element corresponding to that control is determined, and corresponding multimedia data is generated from the multimedia elements and their associated time frames. Multimedia elements can thus be added to controls corresponding to different time frames through simple selection operations, and multimedia data generated from them, which greatly reduces the difficulty of multimedia editing in games and makes multimedia editing easier to perform on a touch terminal.

Description

Method, device, terminal equipment and storage medium for editing multimedia in game
The present application is a divisional application of Chinese patent application No. 201910995729.X, filed on October 18, 2019, the content of which is incorporated by reference in this divisional application.
Technical Field
The present application relates to the technical field of software games, and in particular, to a method and apparatus for editing multimedia in a game, a terminal device, and a storage medium.
Background
Cutscenes are widely used in client games, mobile games, console games, and other game types. Like a film, a cutscene presents the plot to the game player as an animated video, which can give the player a strong sense of immersion and relieve the fatigue of working through game levels. However, producing a cutscene usually requires professional game staff using a professional editing tool to control display elements such as camera motion, character walking, character actions, and character dialogue along a timeline, so an ordinary game player cannot participate in or complete the production of a cutscene.
In the prior art, to satisfy game players' enthusiasm for creating cutscenes and to improve their game experience, some online game operators provide players with a cutscene editing tool through which players can make cutscenes in the game. For example, some animation editing tools provided for online games on the personal computer (PC) platform offer the player a timeline editing interface on which, with the timeline as the main line, the player edits cutscene elements in the time dimension by controlling switches among multiple cameras, the parameters of a single camera, sound effects, and the like.
However, the timeline operation interface of a conventional cutscene editing tool is still complex, and editing a cutscene through it remains difficult for the game player.
Disclosure of Invention
The application provides a method, a device, terminal equipment and a storage medium for editing multimedia in a game, which can reduce the operation difficulty of a game player when editing a cutscene.
In a first aspect, an embodiment of the present application provides a method for editing multimedia in a game, where a graphical user interface is provided through a terminal device, the method including:
providing a multimedia editing interface on the graphical user interface;
controlling the multimedia editing interface to display at least one element selection control, wherein different element selection controls are associated with different time frames;
responding to the selection operation of the element selection control, and determining the multimedia element corresponding to the element selection control;
generating corresponding multimedia data according to the multimedia elements and the time frames corresponding to the multimedia elements.
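The four steps above can be sketched as a minimal data model. This is a hypothetical illustration only: the class names, the string encoding of multimedia elements, and the list-of-pairs output are assumptions for clarity, not the patent's implementation.

```python
from dataclasses import dataclass, field
from typing import List, Optional, Tuple

@dataclass
class ElementSelectionControl:
    """A selection control on the editing interface, associated with one time frame."""
    time_frame: int
    element: Optional[str] = None   # multimedia element chosen via a selection operation

@dataclass
class MultimediaEditor:
    controls: List[ElementSelectionControl] = field(default_factory=list)

    def select(self, index: int, element: str) -> None:
        """Respond to a selection operation: bind a multimedia element to a control."""
        self.controls[index].element = element

    def generate(self) -> List[Tuple[int, str]]:
        """Generate multimedia data: (time frame, element) pairs in time order."""
        return sorted((c.time_frame, c.element)
                      for c in self.controls if c.element is not None)

# Two element selection controls associated with different time frames
editor = MultimediaEditor([ElementSelectionControl(0), ElementSelectionControl(1)])
editor.select(1, "speak: hello")
editor.select(0, "move: stage-left")
data = editor.generate()   # [(0, 'move: stage-left'), (1, 'speak: hello')]
```

Selection order does not matter here; the generated data is ordered by time frame, matching the idea that each control's position, not the player's editing sequence, determines when its element plays.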
Optionally, the providing a multimedia editing interface on the graphical user interface includes:
and responding to the editing instruction, and displaying a multimedia editing interface through a graphical user interface.
Optionally, the graphical user interface includes a game screen including a first virtual object, and the responding to the editing instruction includes:
and responding to the triggering operation of the first virtual object.
Optionally, the multimedia editing interface includes an item selection control, and the controlling the multimedia editing interface to display at least one element selection control includes:
and responding to the selection operation of the item selection control, and controlling the multimedia editing interface to display the object to be edited and at least one element selection control corresponding to the object to be edited.
Optionally, the controlling the multimedia editing interface to display at least one element selection control includes:
and controlling the horizontal axis direction of the multimedia editing interface to display a plurality of time frames arranged in time sequence, and displaying at least one corresponding element selection control in the vertical axis direction of each time frame.
Optionally, the determining, in response to the selection operation of the element selection control, the multimedia element corresponding to the element selection control includes:
in response to a selection operation of the element selection control, displaying an element submenu corresponding to the element selection control, the element submenu comprising: at least one element selection;
in response to a selection operation on the element submenu, acquiring the multimedia element corresponding to each element selection control.
Optionally, the responding to the selection operation of the element submenu obtains the multimedia element corresponding to each element selection control, including:
responding to the selection operation of the element submenu, and displaying a corresponding element editing interface;
and responding to the editing parameters of the element editing interface, and acquiring the multimedia elements corresponding to the element selection controls.
Optionally, the element submenu includes one or more of the following: entering the scene, a transition, leaving the scene, a movement, an action, a speech, and an expression; when the element submenu contains any one of entering the scene, a transition, leaving the scene, or a movement, the element editing interface includes: a mobile coordinate information editing interface.
Optionally, the coordinate points of the mobile coordinate information editing interface correspond to coordinate points at corresponding positions in the multimedia presentation interface of the game interface; the editing parameters of the element editing interface include: a selection of the end position of the virtual character.
Optionally, the element editing interface includes m×n virtual selection cells, each cell corresponding to a coordinate point at the corresponding position in the multimedia presentation interface of the game interface, where m and n are integers greater than 0; the acquiring, in response to the editing parameters of the element editing interface, the multimedia element corresponding to each element selection control includes:
in response to a selection operation on a target cell in the element editing interface, acquiring the end position information of the multimedia element corresponding to the element selection control.
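A minimal sketch of the m×n cell-to-coordinate mapping described above, assuming (hypothetically) that a selected cell maps to the centre of the corresponding region of the presentation interface; the function name and the scene dimensions are illustrative, not taken from the patent.

```python
def grid_to_scene_coordinate(row, col, m, n, scene_width, scene_height):
    """Map cell (row, col) of an m x n selection grid to the coordinate of
    that cell's centre in the multimedia presentation interface."""
    if not (0 <= row < m and 0 <= col < n):
        raise ValueError("cell outside the m x n grid")
    cell_width = scene_width / n      # n cells across the horizontal axis
    cell_height = scene_height / m    # m cells down the vertical axis
    return ((col + 0.5) * cell_width, (row + 0.5) * cell_height)

# Selecting the top-left cell of a 4 x 6 grid over a 1200 x 800 scene
end_position = grid_to_scene_coordinate(0, 0, m=4, n=6,
                                        scene_width=1200, scene_height=800)
```

Tapping one cell thus yields a single end-position coordinate, which is the "end position information" the selection operation acquires.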
Optionally, the object to be edited includes: basic materials and/or virtual roles; the element selection control corresponding to the basic material comprises one or more of the following: scene selection control, background selection control, music selection control, and sound effect selection control.
Optionally, the controlling the multimedia editing interface to display at least one element selection control includes:
and controlling the vertical axis direction of the multimedia editing interface to display the object to be edited, and the horizontal axis direction to display a plurality of time frames arranged in time sequence, wherein the vertical axis direction of each time frame displays at least one element selection control corresponding to each object to be edited so as to associate the element selection controls with different time frames.
Optionally, after generating the corresponding multimedia data according to the multimedia element and the time frame corresponding to the multimedia element, the method further includes:
and responding to the selection operation of the preview mode selection control, and playing the multimedia data, wherein a playing interface of the multimedia data comprises time frame information.
Optionally, after generating the corresponding multimedia data according to the multimedia element and the time frame corresponding to the multimedia element, the method further includes:
in response to a selection operation on the frame selection mode selection control, playing the multimedia data, wherein a playing interface of the multimedia data includes a time frame selection virtual control;
in response to a dragging operation on the time frame selection virtual control, jumping the multimedia data to the corresponding time frame for playing.
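The frame-selection behaviour above, where dragging the time frame selection virtual control jumps playback to the chosen frame, might be sketched as follows; the class and method names are assumptions, not the patent's implementation.

```python
class FramePlayer:
    """Hypothetical frame-selection playback: dragging the time frame
    selection control jumps playback to the chosen time frame."""

    def __init__(self, frames):
        self.frames = frames          # one multimedia element per time frame
        self.current = 0              # index of the frame being played

    def on_drag(self, target_frame):
        """Handle a drag on the time frame selection virtual control."""
        # Clamp the drag target to the valid frame range, then jump there.
        self.current = max(0, min(target_frame, len(self.frames) - 1))
        return self.frames[self.current]

player = FramePlayer(["enter", "move", "speak", "exit"])
element = player.on_drag(2)           # playback jumps to time frame 2
```

Clamping keeps a drag past either end of the timeline on the first or last frame rather than raising an error, which is the usual behaviour for a scrubbing control.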
Optionally, the method further comprises:
in response to a triggering operation for multimedia viewing, displaying a multimedia data playlist, the multimedia data playlist comprising: at least one identifier of the multimedia data and a corresponding playing virtual control;
and playing the target multimedia data in response to the selection operation of the playing virtual control of the target multimedia data in the multimedia data play list.
Optionally, the multimedia data playlist further comprises: supporting virtual controls corresponding to the identifications of the multimedia data; the method further comprises the steps of:
and updating the support data of the target multimedia data in response to the selection operation of the support virtual control of the target multimedia data in the multimedia data play list.
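A minimal sketch of the support-control behaviour, assuming support data is a simple per-entry counter; the patent does not specify the form of the support data, so the class and field names here are hypothetical.

```python
class MultimediaPlaylist:
    """Hypothetical playlist keeping one support counter per entry."""

    def __init__(self, identifiers):
        self.support = {identifier: 0 for identifier in identifiers}

    def on_support_selected(self, target):
        """Handle a selection of the support virtual control of `target`."""
        self.support[target] += 1
        return self.support[target]

playlist = MultimediaPlaylist(["clip-1", "clip-2"])
count = playlist.on_support_selected("clip-1")   # support data updated to 1
```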
In a second aspect, an embodiment of the present application provides a multimedia editing apparatus in a game, providing a graphical user interface through a terminal device, the apparatus comprising: the interface providing module is used for providing a multimedia editing interface on the graphical user interface; the control display module is used for controlling the multimedia editing interface to display at least one element selection control, wherein different element selection controls are associated with different time frames; the response module is used for responding to the selection operation of the element selection control and determining the multimedia element corresponding to the element selection control; and the generation module is used for generating corresponding multimedia data according to the multimedia elements and the time frames corresponding to the multimedia elements.
Optionally, the interface providing module is specifically configured to display a multimedia editing interface through the graphical user interface in response to an editing instruction.
Optionally, the graphical user interface includes a game screen, the game screen includes a first virtual object, and the interface providing module is specifically configured to respond to a triggering operation on the first virtual object.
Optionally, the multimedia editing interface includes an item selection control, and the control display module is specifically configured to control the multimedia editing interface to display an object to be edited and at least one element selection control corresponding to the object to be edited in response to a selection operation of the item selection control.
Optionally, the control display module is specifically configured to control the multimedia editing interface to display a plurality of time frames arranged in time sequence in a horizontal axis direction, and display at least one corresponding element selection control in a vertical axis direction of each time frame.
Optionally, the response module includes: the first response sub-module is used for responding to the selection operation of the element selection control, displaying an element sub-menu corresponding to the element selection control, wherein the element sub-menu comprises: at least one element selection; and the second response sub-module is used for responding to the selection operation of the element sub-menu and acquiring the multimedia elements corresponding to the element selection controls.
Optionally, the second response sub-module is specifically configured to display a corresponding element editing interface in response to a selection operation of the element sub-menu; and responding to the editing parameters of the element editing interface, and acquiring the multimedia elements corresponding to the element selection controls.
Optionally, the element submenu includes one or more of the following: entering the scene, a transition, leaving the scene, a movement, an action, a speech, and an expression; when the element submenu contains any one of entering the scene, a transition, leaving the scene, or a movement, the element editing interface includes: a mobile coordinate information editing interface.
Optionally, the coordinate points of the mobile coordinate information editing interface correspond to coordinate points at corresponding positions in the multimedia presentation interface of the game interface; the editing parameters of the element editing interface include: a selection of the end position of the virtual character.
Optionally, the element editing interface includes m×n virtual selection cells, each cell corresponding to a coordinate point at the corresponding position in the multimedia presentation interface of the game interface, where m and n are integers greater than 0; the second response sub-module is specifically configured to acquire the end position information of the multimedia element corresponding to the element selection control in response to a selection operation on a target cell in the element editing interface.
Optionally, the object to be edited includes: basic materials and/or virtual roles; the element selection control corresponding to the basic material comprises one or more of the following: scene selection control, background selection control, music selection control, and sound effect selection control.
Optionally, the control display module is specifically configured to control the multimedia editing interface to display an object to be edited in a vertical axis direction, display a plurality of time frames arranged in time sequence in a horizontal axis direction, and display at least one element selection control corresponding to each object to be edited in the vertical axis direction of each time frame, so as to associate the element selection controls with different time frames.
Optionally, the apparatus further comprises: and the preview playing module is used for responding to the selection operation of the preview mode selection control and playing the multimedia data, wherein a playing interface of the multimedia data comprises time frame information.
Optionally, the apparatus further comprises: the frame selection playing module is used for responding to the selection operation of the frame selection mode selection control and playing the multimedia data, wherein a playing interface of the multimedia data comprises a time frame selection virtual control; and the method is used for responding to the dragging operation of selecting the virtual control for the time frame, and the multimedia data jumps to the corresponding time frame for playing.
Optionally, the apparatus further comprises: a multimedia viewing module for displaying a multimedia data playlist in response to a triggering operation for multimedia viewing, the multimedia data playlist comprising: at least one identifier of the multimedia data and a corresponding playing virtual control; and the virtual control is used for responding to the selection operation of the playing virtual control of the target multimedia data in the multimedia data play list, and playing the target multimedia data.
Optionally, the multimedia data playlist further comprises: supporting virtual controls corresponding to the identifications of the multimedia data; the multimedia viewing module is further configured to update the support data for the target multimedia data in response to a selection operation of the support virtual control for the target multimedia data in the multimedia data playlist.
According to the embodiments of the application, a multimedia editing interface can be provided on a graphical user interface, and the multimedia editing interface is controlled to display at least one element selection control, where different element selection controls are associated with different time frames. In response to a selection operation on an element selection control, the multimedia element corresponding to that control is determined, and corresponding multimedia data is generated from the multimedia elements and their associated time frames. Multimedia elements can thus be added to controls corresponding to different time frames through simple selection operations, and multimedia data generated from them, which greatly reduces the difficulty of multimedia editing in games and makes multimedia editing easier to perform on a touch terminal.
In a fourth aspect, an embodiment of the present application provides a game data processing method, where a graphical user interface is provided by a terminal device, where the graphical user interface includes a game screen, and the game screen includes at least a part of a game scene and a virtual object, and the method includes:
determining a specific scene area in the game scene;
acquiring multimedia data, wherein the multimedia data comprises virtual object information and performance parameters;
and rendering a virtual character corresponding to the virtual object information in the specific scene area according to the performance parameters, and controlling the virtual character to perform.
Optionally, the multimedia data includes at least 2 pieces of multimedia sub data, and the multimedia sub data includes data edited and uploaded by the player.
Optionally, the rendering the virtual character corresponding to the virtual object information in the specific scene area according to the performance parameter, and controlling the virtual character to perform, including:
and rendering virtual roles corresponding to the virtual object information in the specific scene area according to the priority order corresponding to the preset conditions in sequence according to the performance parameters of at least 2 pieces of multimedia sub-data, and controlling the virtual roles to perform.
In a fifth aspect, an embodiment of the present application provides a game data processing method, where a graphical user interface is provided by a terminal device, where the graphical user interface includes a game screen, and the game screen includes at least a part of a game scene and a virtual object, and the method includes:
determining a specific scene area in the game scene, the specific scene area containing the virtual objects performing according to performance parameters;
and responding to the current position information of the player virtual character, and controlling the viewing angle of the specific scene area to be converted into the angle corresponding to the current position information.
Optionally, the preset range of the specific scene area includes: a plurality of position selection controls;
the controlling, in response to current position information of the player avatar, a viewing angle of the specific scene area to be converted to an angle corresponding to the current position information includes:
and in response to the movement of the virtual character of the player to the position of the target position selection control, controlling the viewing angle of the specific scene area to be converted into the viewing angle corresponding to the target position selection control.
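A sketch of deriving a viewing angle from a position selection control, assuming (hypothetically) that the converted angle is simply the direction from the player's current position to the centre of the specific scene area; the patent does not define the geometry, so this is illustrative only.

```python
import math

def viewing_angle(scene_centre, player_pos):
    """Angle (radians) from the player's current position to the centre of
    the specific scene area, used as the converted viewing angle."""
    dx = scene_centre[0] - player_pos[0]
    dy = scene_centre[1] - player_pos[1]
    return math.atan2(dy, dx)

# The player moves to a position selection control east of the scene area,
# so the view now faces west (the -x direction).
angle = viewing_angle(scene_centre=(0.0, 0.0), player_pos=(10.0, 0.0))
```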
Optionally, after the determining the specific scene area in the game scene, the method further includes:
Displaying the interactive material control of the virtual object, wherein the interactive material control comprises one or more of the following: role dialogue control, scenario control and role control;
and responding to the clicking operation of the interactive material control, and switching the corresponding performance scenario.
In a sixth aspect, an embodiment of the present application provides a game data processing method, where a graphical user interface is provided by a terminal device, where the graphical user interface includes a game screen, and the game screen includes at least a part of a game scene and a virtual object, and the method includes:
determining a specific scene area in the game scene, the specific scene area containing the virtual objects performing according to performance parameters;
in response to a selection operation of a particular virtual control, a current display interface is switched from the game scene to a multimedia presentation interface that includes the virtual object performing according to performance parameters.
Optionally, the multimedia presentation interface comprises a multimedia data playlist, the multimedia data playlist comprising: at least one identifier of the multimedia data and a corresponding playing virtual control;
and responding to the selection operation of the playing virtual control of the target multimedia data in the multimedia data play list, and playing the target multimedia data.
In a seventh aspect, an embodiment of the present application provides a terminal device, including: a processor, a storage medium, and a bus, the storage medium storing machine-readable instructions executable by the processor, the processor and the storage medium communicating over the bus when the terminal device is operating, the processor executing the machine-readable instructions to perform a method as described in any one of the first or third to sixth aspects.
In an eighth aspect, an embodiment of the present application provides a storage medium having stored thereon a computer program which, when executed by a processor, performs a method according to any of the first or third to sixth aspects.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present application, the drawings that are needed in the embodiments will be briefly described below, it being understood that the following drawings only illustrate some embodiments of the present application and therefore should not be considered as limiting the scope, and other related drawings may be obtained according to these drawings without inventive effort for a person skilled in the art.
Fig. 1 is a schematic flow chart of a method for editing multimedia in a game according to an embodiment of the present application;
FIG. 2 is a schematic diagram of a multimedia editing interface according to an embodiment of the present application;
FIG. 3 is another schematic diagram of a multimedia editing interface provided by an embodiment of the present application;
FIG. 4 is a schematic diagram of a multimedia editing interface after determining multimedia elements according to an embodiment of the present application;
FIG. 5 is a schematic diagram of a game screen provided by an embodiment of the present application;
FIG. 6 illustrates a schematic diagram of an NPC session provided by an embodiment of this application;
FIG. 7 is a schematic diagram of a multimedia editing interface according to an embodiment of the present application;
FIG. 8 is a schematic diagram of a multimedia editing interface according to an embodiment of the present application;
FIG. 9 is another flow chart of a method for editing multimedia in a game according to an embodiment of the present application;
FIG. 10 is a schematic diagram of an element submenu provided by an embodiment of the present application;
FIG. 11 is another schematic diagram of a multimedia editing interface provided by an embodiment of the present application after determining multimedia elements;
FIG. 12 is a schematic flow chart of a method for editing multimedia in a game according to an embodiment of the present application;
FIG. 13 is a schematic diagram showing a mobile coordinate information editing interface in a game provided by an embodiment of the present application;
FIG. 14 is a schematic view of a multimedia editing interface according to an embodiment of the present application;
FIG. 15 is a schematic diagram of a preview mode game interface provided by an embodiment of the present application;
FIG. 16 is a schematic flow chart of a method for editing multimedia in a game according to an embodiment of the present application;
FIG. 17 is a schematic diagram of a frame-mode game interface provided by an embodiment of the present application;
FIG. 18 is a schematic flow chart of a method for editing multimedia in a game according to an embodiment of the present application;
fig. 19 is a schematic diagram of a multimedia data playlist according to an embodiment of the present application;
fig. 20 is a schematic diagram showing the structure of a multimedia editing apparatus in a game according to an embodiment of the present application;
fig. 21 is a schematic diagram showing another configuration of a multimedia editing apparatus in a game according to an embodiment of the present application;
fig. 22 is a schematic diagram showing still another configuration of a multimedia editing apparatus in a game according to an embodiment of the present application;
fig. 23 is a schematic view showing still another configuration of a multimedia editing apparatus in a game provided by an embodiment of the present application;
fig. 24 is a schematic view showing still another configuration of a multimedia editing apparatus in a game provided by an embodiment of the present application;
Fig. 25 shows a schematic structural diagram of a terminal device according to an embodiment of the present application.
Detailed Description
For the purpose of making the objects, technical solutions and advantages of the embodiments of the present application more apparent, the technical solutions of the embodiments of the present application will be clearly and completely described below with reference to the accompanying drawings in the embodiments of the present application, and it is apparent that the described embodiments are some embodiments of the present application, but not all embodiments of the present application. The components of the embodiments of the present application generally described and illustrated in the figures herein may be arranged and designed in a wide variety of different configurations.
Thus, the following detailed description of the embodiments of the application, as presented in the figures, is not intended to limit the scope of the application, as claimed, but is merely representative of selected embodiments of the application. All other embodiments, which can be made by those skilled in the art based on the embodiments of the application without making any inventive effort, are intended to be within the scope of the application.
It should be noted that: like reference numerals and letters denote like items in the following figures, and thus once an item is defined in one figure, no further definition or explanation thereof is necessary in the following figures. In the description of the present application, it should be noted that the terms "first," "second," "third," and the like are used merely to distinguish between descriptions and are not to be construed as indicating or implying relative importance.
The application provides a multimedia editing method in a game, which can be applied to a terminal device. The terminal device may be a local terminal device or a server, where when the terminal device is a server, in an optional implementation manner, the game is a cloud game.
In an alternative embodiment, cloud gaming refers to a game style based on cloud computing. In the running mode of the cloud game, a running main body of the game program and a game picture presentation main body are separated, the storage and running of the game data processing method are completed on a cloud game server, and the cloud game client is used for receiving and sending data and presenting the game picture, for example, the cloud game client can be a display device with a data transmission function, such as a mobile terminal, a television, a computer, a palm computer and the like, which is close to a user side; the terminal device for processing game data is a cloud game server in the cloud. When playing a game, a player operates the cloud game client to send an operation instruction to the cloud game server, the cloud game server runs the game according to the operation instruction, codes and compresses data such as game pictures and the like, returns the data to the cloud game client through a network, and finally decodes the data through the cloud game client and outputs the game pictures.
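The cloud-game loop described above (the client sends operation instructions, the server runs the game and encodes the picture, and the client decodes and presents it) can be caricatured in a few lines; zlib stands in for real video encoding, and all names are illustrative assumptions.

```python
import zlib

def cloud_server_step(game_state, operation):
    """Run one game tick for the received operation instruction and return
    an encoded picture (zlib stands in for real video encoding)."""
    game_state["ticks"] += 1
    frame = "frame {}: {}".format(game_state["ticks"], operation).encode()
    return zlib.compress(frame)

def cloud_client_display(encoded_frame):
    """Decode the received data on the client side and present the picture."""
    return zlib.decompress(encoded_frame).decode()

state = {"ticks": 0}
shown = cloud_client_display(cloud_server_step(state, "move-left"))
```

The point of the split is visible in the types: the client never touches `state`, only the encoded frame that travels over the network.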
In an alternative embodiment, the terminal device may be a local terminal device. The local terminal device stores a game program and is used for presenting game pictures. The local terminal device interacts with the player through the graphical user interface; that is, the game program is conventionally downloaded, installed, and run on the electronic device. The manner in which the local terminal device provides the graphical user interface to the player may include, for example, rendering a display on the terminal screen, or providing the graphical user interface to the player by projection (either two-dimensional planar projection or three-dimensional stereoscopic projection) via an output device (e.g., a projection device) of the local terminal device. For example, the local terminal device may include a display screen for presenting a graphical user interface including game visuals, and a processor for running the game, generating the graphical user interface, and controlling the display of the graphical user interface on the display screen. Alternatively, the local terminal device may not include a display screen, but may instead display the graphical user interface in real space in two-dimensional planar or three-dimensional stereoscopic form by projection, and receive the player's operations on the graphical user interface through a sensing device provided by the terminal device or a third-party device, thereby implementing interaction between the terminal device and the player.
The method for editing the multimedia in the game provided by the embodiment of the application is applied to the terminal equipment, so that the operation difficulty of a game player in the process of editing the cutscene can be reduced.
Fig. 1 is a schematic flow chart of a method for editing multimedia in a game according to an embodiment of the present application.
As shown in fig. 1, the method of editing multimedia in a game may include:
s101, providing a multimedia editing interface on the graphical user interface.
As described above, the graphical user interface may be a game interface; for example, when the terminal device loads a game, the game interface may be provided to the user. The terminal device here may be the aforementioned local terminal device, or the aforementioned cloud game client. A multimedia editing interface may be provided in the graphical user interface.
S102, controlling the multimedia editing interface to display at least one element selection control.
Wherein different element selection controls are associated with different time frames.
Optionally, at least one element selection control may be displayed in the multimedia editing interface, each element selection control may be associated with a different time frame. For example, if element selection control 1, element selection control 2, and element selection control 3 are present, the time frame may be a plurality of frames arranged in time sequence during the multimedia presentation, e.g., element selection control 1 may be associated with a first frame, element selection control 2 may be associated with a second frame, and element selection control 3 may be associated with a third frame. Of course, a time frame may be associated with multiple element selection controls, without specific limitation herein.
Fig. 2 is a schematic diagram of a multimedia editing interface provided by an embodiment of the present application. As shown in fig. 2, at least one element selection control 210 may be displayed in the multimedia editing interface; when there are a plurality of element selection controls 210, they may be associated with different time frames. For example, the time frames respectively corresponding to the plurality of element selection controls 210 shown in fig. 2 may be sequentially ordered in the horizontal direction, that is: the first element selection control 210 may be associated with a first frame, the second element selection control 210 with a second frame, and so on. After the multimedia data is generated, the multimedia element added through the first element selection control 210 is played in the first frame.
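The association between controls and time frames can be sketched as follows; the class and attribute names are illustrative assumptions, not terms from the patent.

```python
# Hypothetical sketch: each element selection control records the time frame
# it is associated with, and holds the multimedia element once the player
# performs a selection operation on it.
class ElementSelectionControl:
    def __init__(self, frame_index):
        self.frame_index = frame_index   # 1-based frame along the time axis
        self.element = None              # filled in by a selection operation

# One control per frame, ordered left to right as in fig. 2.
controls = [ElementSelectionControl(i) for i in range(1, 4)]
```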
S103, responding to the selection operation of the element selection control, and determining the multimedia element corresponding to the element selection control.
Optionally, when a user (such as a game player) performs a selection operation on the element selection control, the multimedia editing interface may present the selectable multimedia element identifier corresponding to the element selection control for the user.
Fig. 3 is another schematic diagram of a multimedia editing interface provided by an embodiment of the present application. As shown in fig. 3, when the player selects the third element selection control from left to right in the first row, the multimedia editing interface may present the multimedia elements corresponding to that control, for example: exit, move, speak, expression, and the like.
Alternatively, the selection operation of the element selection control may be a touch click operation, a sliding operation, a long press operation, or the like performed on the touch screen, which is not limited herein.
The player can select the multimedia element corresponding to the element selection control presented by the multimedia editing interface, and the multimedia element is added on the time frame corresponding to each element selection control in the multimedia editing interface. For example, fig. 4 is a schematic diagram of a multimedia editing interface provided by the embodiment of the present application after determining a multimedia element, and when a user adds the multimedia element in the multimedia editing interface, the multimedia editing interface may be as shown in fig. 4.
Optionally, in the embodiment of the present application, each multimedia element has a corresponding time frame, and the time frames corresponding to different multimedia elements may be the same or different. After the multimedia element corresponding to the element selection control is determined in response to the selection operation, the frame length corresponding to that multimedia element can be obtained. For different multimedia elements, the frame length may be a preset fixed length, for example, one martial arts action occupies one frame; or it may be calculated according to the editing parameters specifically selected by the player, for example, when the player edits a line of dialogue for a virtual character, the frame length can be calculated from the dialogue length and a preset speech rate. When multimedia elements are added to the time frames corresponding to each element selection control in the multimedia editing interface, one or more time frames can be occupied automatically and adaptively according to the frame lengths of the different elements.
For example, among the multimedia elements shown in fig. 4, if the frame length of the entry is 1 frame and the frame length of the move is 2 frames, the entry can be automatically added to the first frame, the move to the second and third frames, and so on.
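The two frame-length rules and the adaptive placement can be sketched as follows; the element names, fixed lengths, and characters-per-frame rate are illustrative assumptions.

```python
# Assumed fixed frame lengths for action-type elements.
FIXED_FRAME_LENGTHS = {"entry": 1, "move": 2, "exit": 1}

def dialogue_frame_length(text, chars_per_frame=6):
    # A line of dialogue occupies enough frames to speak it at a preset rate.
    return max(1, -(-len(text) // chars_per_frame))  # ceiling division

def place_elements(elements):
    """Lay elements onto the timeline in order, each adaptively occupying
    one or more time frames according to its frame length."""
    timeline, frame = {}, 1
    for elem in elements:
        length = FIXED_FRAME_LENGTHS.get(elem, dialogue_frame_length(elem))
        for f in range(frame, frame + length):
            timeline[f] = elem
        frame += length
    return timeline

timeline = place_elements(["entry", "move"])  # entry -> frame 1, move -> frames 2-3
```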
And S104, generating corresponding multimedia data according to the multimedia elements and the time frames corresponding to the multimedia elements.
After the steps S101, S102, and S103 are completed, corresponding multimedia data may be generated according to the multimedia elements added in the multimedia editing interface and the time frames corresponding to the multimedia elements.
Taking the foregoing example shown in fig. 4, the generated multimedia data is: "entry - move - speak (the edited line of dialogue) - exit". The actions in the generated multimedia data may be performed by a default virtual character, or by a virtual character the user selects to add, which is not particularly limited herein.
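The assembly step of S104 can be sketched as follows: the frame-to-element timeline is collapsed into the ordered action sequence of the generated multimedia data. The function name and data layout are assumptions.

```python
def generate_sequence(timeline):
    """Collapse a frame -> element timeline into the ordered action sequence
    of the generated multimedia data (multi-frame elements are merged)."""
    sequence = []
    for frame in sorted(timeline):
        element = timeline[frame]
        if not sequence or sequence[-1] != element:
            sequence.append(element)
    return " - ".join(sequence)

data = generate_sequence({1: "entry", 2: "move", 3: "move",
                          4: "speak", 5: "exit"})
```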
In the embodiment of the application, a multimedia editing interface can be provided on the graphical user interface, and the multimedia editing interface is controlled to display at least one element selection control, where different element selection controls are associated with different time frames. The multimedia element corresponding to an element selection control is determined in response to a selection operation on that control, and corresponding multimedia data is generated according to the multimedia elements and their time frames. In this way, multimedia elements can be added on the controls corresponding to different time frames through simple selection operations, and the multimedia data is then generated, which greatly reduces the operation difficulty of multimedia editing in games and makes multimedia editing easier to realize on a touch terminal.
Optionally, the providing a multimedia editing interface on the graphical user interface may include:
and responding to the editing instruction, and displaying a multimedia editing interface through a graphical user interface.
The editing instruction may be a click operation on a specific position, a specific virtual object, or a specific control in the graphical user interface; alternatively, the user may control a virtual object in the graphical user interface to move to a specific position that can trigger the multimedia editing interface, etc.
In one embodiment, the graphical user interface may include a game screen including a first virtual object, and the responding to the editing instruction may include:
and responding to the triggering operation of the first virtual object.
For example, the first virtual object may be a specific area in the game, and the triggering operation may be that the player's virtual character moves to that area of the game map, whereupon the multimedia editing interface is triggered. Or, the first virtual object may be a virtual control provided for the player in the game interface, and the triggering operation may be that the player clicks the virtual control, so that the multimedia editing interface is opened. Or, the first virtual object may be a Non-Player Character (NPC) in the game, in which case the opening mode of the multimedia editing interface may be provided through an NPC dialogue: the triggering operation may be that the player controls the virtual character to move within a preset distance around the NPC, and the multimedia editing interface is opened directly; or the player controls the virtual character to move within a preset distance of the NPC, an NPC dialogue is displayed on the game interface, and the player enters the multimedia editing interface by selecting a control contained in the NPC dialogue.
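The proximity trigger option can be sketched as follows; the position representation and the distance threshold are illustrative assumptions.

```python
import math

def npc_trigger(player_pos, npc_pos, preset_distance=3.0):
    """One of the trigger options described above: the multimedia editing
    interface (or the NPC dialogue leading to it) opens when the
    player-controlled character moves within a preset distance of the NPC."""
    return math.dist(player_pos, npc_pos) <= preset_distance
```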
Fig. 5 shows a schematic diagram of a game screen provided by an embodiment of the present application.
As shown in fig. 5, in the game screen, "the hugger 510" may be an NPC character in the game, and "Situ Jiao warm 520" may be the virtual character controlled by the player; the player can trigger an NPC dialogue with "the hugger 510" by controlling "Situ Jiao warm 520" to move within the preset distance of "the hugger 510".
Fig. 6 shows a schematic diagram of an NPC dialogue provided by an embodiment of the present application. As shown in fig. 6, the NPC dialogue may offer the following options: 1) start a play; 2) how to play a drama; 3) farewell. All three are selectable controls: when the player clicks "start a play", the multimedia editing interface is entered; when the player clicks "how to play a drama", instructions for using the multimedia editing interface can be displayed; when the player clicks "farewell", the NPC dialogue is exited and the original game interface is restored.
Fig. 7 is a schematic diagram of a multimedia editing interface according to an embodiment of the present application.
Alternatively, as shown in FIG. 7, the multimedia editing interface may include an item selection control 710. The controlling the multimedia editing interface to display at least one element selection control may include:
And responding to the selection operation of the item selection control, and controlling the multimedia editing interface to display the object to be edited and at least one element selection control corresponding to the object to be edited.
Fig. 8 is a schematic diagram of a multimedia editing interface according to an embodiment of the present application.
As shown in fig. 8, the object to be edited may be a virtual character, a scene, a sound effect, music, or the like that can be presented in the multimedia data. For example, it may be a virtual character in a cutscene, and different virtual characters may each have a unique identification (Identity Document, ID) bound to them. When the player performs a selection operation on the item selection control, a plurality of selectable objects to be edited are displayed; if a certain virtual character is selected, the multimedia editing interface can display that virtual character (i.e. the object to be edited) and at least one element selection control corresponding to it.
The multimedia element corresponding to the virtual character is then determined in response to the selection operation on the element selection control corresponding to that virtual character. That is, if the multimedia element in the drama is an action such as an entry, a move, an expression, or a line of dialogue, that action is executed by the virtual character.
Optionally, the controlling the multimedia editing interface to display at least one element selection control may include:
and controlling the horizontal axis direction of the multimedia editing interface to display a plurality of time frames arranged in time sequence, and displaying at least one corresponding element selection control in the vertical axis direction of each time frame.
With continued reference to fig. 8, for example, a time axis 810 may be displayed on the multimedia editing interface; a plurality of time frames, such as a first frame, a second frame, ..., a ninth frame, and so on, may be sequentially arranged in time order in the horizontal axis direction, and at least one element selection control may be displayed in the vertical axis direction corresponding to each frame.
Fig. 9 is another flow chart of a method for editing multimedia in a game according to an embodiment of the present application.
Optionally, as shown in fig. 9, the determining, in response to the selection operation of the element selection control, the multimedia element corresponding to the element selection control may include:
and S901, responding to the selection operation of the element selection control, and displaying an element submenu corresponding to the element selection control.
The element submenu includes: at least one element selection.
As described above, when the player performs the selection operation on the element selection control, the element submenu corresponding to the element selection control may be presented to the player.
Taking a game interface as an example, fig. 10 shows a schematic diagram of an element submenu provided by an embodiment of the present application. If the user clicks the third element selection control from left to right in the first row, the element submenu corresponding to that control may be as shown in fig. 10. The element submenu may include multiple multimedia elements such as exit, move, speak, and expression.
In the embodiment of the application, the element submenus corresponding to different element selection controls can be the same or different. That is, the player may choose to view the same or different element selections from the element submenu by selecting different element selection controls, and the application is not limited in this regard.
S902, responding to selection operation of the element submenu, and acquiring multimedia elements corresponding to the element selection controls.
Optionally, the user may perform a further selection operation on each multimedia element in the element submenu.
Taking the element submenu shown in fig. 10 as an example, the user may select a multimedia element such as exit, move, speak, or expression to be added to the time frame (e.g., the third frame) corresponding to the third element selection control.
Fig. 11 is another schematic diagram of a multimedia editing interface provided by an embodiment of the present application after determining a multimedia element, where after a user adds the multimedia element to the multimedia editing interface in the above manner, the multimedia editing interface may be as shown in fig. 11.
In fig. 11, the dramatic actions composed of the multimedia elements corresponding to the first dramatic character are as follows: first and second frames: entry; third and fourth frames: heavy strike; fifth frame: blank; sixth and seventh frames: exit.
It should be noted that mutual exclusion relations can be preconfigured between multimedia elements to avoid conflicts in the process of generating multimedia data. Optionally, when multimedia elements with a mutual exclusion relation correspond to the same object to be edited, they cannot be added in the same time frame. For example, the entry is mutually exclusive with all other multimedia elements, so after the entry is added to the first frame of a certain virtual character, no other multimedia element can be added to that character's first frame. As another example, the move and the heavy strike are mutually exclusive, so after the heavy strike is added to the third and fourth frames, the move cannot be selected when operating the element selection controls corresponding to those frames. The specific implementation is not limited herein.
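The mutual-exclusion check can be sketched as follows; the element names and the exclusion table are illustrative assumptions in the spirit of the examples above.

```python
# Assumed exclusion table: the entry element excludes every other element,
# and the move and heavy-strike elements exclude each other.
ALL_ELEMENTS = {"entry", "exit", "move", "speak", "expression", "heavy strike"}
MUTEX = {
    "entry": ALL_ELEMENTS - {"entry"},
    "move": {"heavy strike"},
    "heavy strike": {"move"},
}

def can_add(timeline, frame, element):
    """Check whether `element` may be added at `frame` for one object to be
    edited, given the elements already placed in that frame."""
    existing = timeline.get(frame, set())
    return not any(other in MUTEX.get(element, set())
                   or element in MUTEX.get(other, set())
                   for other in existing)
```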
Fig. 12 is a schematic flow chart of a method for editing multimedia in a game according to an embodiment of the present application.
Optionally, as shown in fig. 12, the obtaining, in response to the selection operation on the element submenu, the multimedia element corresponding to each element selection control may include:
s1201, in response to a selection operation of the element submenu, a corresponding element editing interface is displayed.
The element submenu may include one or more of the following: entry, transition, exit, move, action, speak, and expression. When the element submenu contains any of entry, transition, exit, or move, the element editing interface may include: a moving coordinate information editing interface.
The moving coordinate information editing interface may be used to edit the moving direction, position information, and the like of the first virtual object. The coordinate points of the moving coordinate information editing interface may correspond to coordinate points at corresponding positions in the multimedia presentation interface of the game interface. For example, the multimedia presentation interface may be the theatrical stage of a drama animation, or a particular area in the game interface.
Optionally, the editing parameters of the element editing interface may include: selection of the end position of the virtual character. That is, the current position of the virtual character is already determined: initially it is the entry position, and thereafter it is the end position of the last completed action. The player therefore only needs to edit the end position, so that the virtual character is controlled to move to the edited end position.
For example, the element editing interface includes a coordinate axis through which the player can learn the position corresponding to the multimedia presentation interface, and then the player can determine the end position selection of the virtual character by dragging or clicking the position point in the coordinate axis.
Fig. 13 is a schematic diagram showing a mobile coordinate information editing interface in a game provided by an embodiment of the present application.
In an alternative implementation, as shown in fig. 13, the element editing interface may include m×n virtual selection grids, where each grid corresponds to a coordinate point in a corresponding position in the multimedia presentation interface in the game interface, and m and n are integers greater than 0.
The obtaining the multimedia element corresponding to each element selection control in response to the editing parameter of the element editing interface may include: and responding to the selection operation of the target grid in the element editing interface, and acquiring the end position information of the multimedia element corresponding to the element selection control.
For example, if the entry position of the character "A" in the game (the first virtual object) is as shown in fig. 13, and the player slides on the touch interface from the grid where "A" is located to the target grid in the fourth row and fourth column along the arrow direction shown in fig. 13, then among the editing parameters of the element editing interface, the end position of the virtual character is the position in the multimedia presentation interface of the game interface that corresponds to that target grid. Optionally, the player may instead click the target grid in the fourth row and fourth column to indicate the same end position.
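The grid-to-coordinate mapping can be sketched as follows; the cell size and origin are assumptions, since the patent only requires that each grid cell correspond to one coordinate point in the multimedia presentation interface.

```python
def grid_to_position(row, col, cell_size=1.0, origin=(0.0, 0.0)):
    """Map a selected cell of the m x n virtual selection grid (1-based row
    and column) to the corresponding coordinate point in the multimedia
    presentation interface. Returns the center of the cell."""
    ox, oy = origin
    return (ox + (col - 0.5) * cell_size, oy + (row - 0.5) * cell_size)

# The player taps (or slides to) the cell in the fourth row, fourth column:
end_position = grid_to_position(4, 4)
```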
S1202, responding to editing parameters of an element editing interface, and acquiring multimedia elements corresponding to each element selection control.
The multimedia element here is a multimedia element containing the editing parameters described above.
Alternatively, in other embodiments, the element editing interface may be a coordinate system, a grid coordinate interface, or the like, and is not limited to the m×n virtual selection grids.
Optionally, the element editing interface may further include the position information of other objects to be edited in the same frame; for example, the identifiers (such as the avatar and the name) of the other objects to be edited are displayed at the corresponding positions in the element editing interface, so that during editing the position of the virtual character currently being edited does not conflict with the positions of the other objects to be edited.
Alternatively, the object to be edited may include: base material and/or virtual roles. The controlling the multimedia editing interface to display at least one element selection control may include:
and controlling the vertical axis direction of the multimedia editing interface to display the object to be edited, and the horizontal axis direction to display a plurality of time frames arranged in time sequence, wherein the vertical axis direction of each time frame displays at least one element selection control corresponding to each object to be edited so as to associate the element selection controls with different time frames.
Taking a game interface as an example, the basic materials can be scenes, sound effects and the like, and the virtual characters are characters capable of performing multimedia performance, such as virtual characters, animals and the like in a game. Correspondingly, the element selection control corresponding to the base material may include one or more of the following: scene selection control, background selection control, music selection control, and sound effect selection control.
When the element selection control is a scene selection control, the corresponding element submenu may include different scenes. When the element selection control is a background selection control, the corresponding element submenu may include a different game background. When the element selection control is a music selection control, the corresponding element submenu may be a music list, which may include a plurality of music. When the element selection control is an effect selection control, the corresponding element submenu may include a variety of different effects, and the like.
Or, in some embodiments, the element selection controls corresponding to the base material may all be the same, and each may include elements such as scene, background, music, and sound effect; when the user selects an element in such a control, a corresponding list can be presented according to the selected element. Taking the scene as an example, if the user selects the scene element, a scene list may be presented, which may include multiple scenes such as scene 1, scene 2, and scene 3, and the user can select one of them as the scene element to be added under that time frame. Background, music, and sound effects are handled similarly and are not described in detail herein.
Fig. 14 is a schematic view of a multimedia editing interface according to an embodiment of the present application.
As shown in fig. 14, on the horizontal axis corresponding to the scene sound effect (i.e., the base material), elements such as music, a scene, a sound effect, and a background may be added through the scene selection control, the background selection control, the music selection control, and the sound effect selection control on a plurality of time frames arranged in time sequence. For example, music and scenes may be added at the first frame, transitions may be added at the second frame, and so on.
In the embodiment of the present application, if a scene or music is added at a certain frame, it may be held at all subsequent time frames. When the scene or music needs to be switched at a certain frame, the scene selection control or the music selection control can be operated again on that time frame to add the newly selected scene, music, or other element.
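This carry-forward behavior can be sketched as a lookup of the most recent switch point; the track layout and names are illustrative assumptions.

```python
def effective_element(track, frame):
    """Return the scene/music in effect at `frame`: the element added at the
    most recent switch point at or before that frame, since elements persist
    until switched. Returns None before the first switch point."""
    switches = [f for f in track if f <= frame]
    return track[max(switches)] if switches else None

scene_track = {1: "courtyard", 5: "market"}  # hypothetical switch points
```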
Optionally, after generating the corresponding multimedia data according to the multimedia element and the time frame corresponding to the multimedia element, the method may further include:
and responding to the selection operation of the preview mode selection control, and playing the multimedia data, wherein a playing interface of the multimedia data comprises time frame information.
The preview mode selection control may be set in the manner described for the first virtual object in the foregoing embodiment, for example: it may be presented as a virtual control in the graphical user interface or in the multimedia editing interface. In the preview mode, the user can preview the generated multimedia data.
Fig. 15 is a schematic diagram of a game interface in preview mode according to an embodiment of the present application.
As shown in fig. 15, in the preview mode, the time axis may occupy only a small portion at the bottom or top of the screen, displaying the current time information, i.e. the time frame information of the multimedia data, while the larger portion of the screen is used for the previewed drama (i.e. the multimedia data).
Fig. 16 is a schematic flow chart of a method for editing multimedia in a game according to an embodiment of the present application.
Optionally, as shown in fig. 16, after generating the corresponding multimedia data according to the multimedia element and the time frame corresponding to the multimedia element, the method may further include:
and S1601, playing the multimedia data in response to a selection operation of the frame selection mode selection control.
The playing interface of the multimedia data comprises a time frame selection virtual control.
And S1602, responding to the dragging operation of the virtual control selected by the time frame, and jumping the multimedia data to the corresponding time frame for playing.
The frame selection mode selection control may be set with reference to the setting manner of the preview mode selection control, which is not described herein again. In the frame selection mode, the user can preview the generated multimedia data, but at the same time the user can drag the time axis to a specified time and observe the multimedia content at that time.
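The jump of S1602 can be sketched as mapping the dragged position of the time-frame selection virtual control to a frame index; a linear mapping is assumed, since the patent does not prescribe one.

```python
def drag_to_frame(drag_x, axis_width, total_frames):
    """Map the dragged position of the time-frame selection virtual control
    along the time axis to the frame index playback should jump to. The drag
    position is clamped to the axis, and frame indices are 1-based."""
    ratio = min(max(drag_x / axis_width, 0.0), 1.0)
    return max(1, round(ratio * total_frames))
```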
Fig. 17 is a schematic diagram of a game interface in a frame selection mode according to an embodiment of the present application.
As shown in fig. 17, in the frame selection mode, the time axis may occupy the bottom or top of the screen, and other spaces display drama (multimedia data).
Optionally, in some embodiments, editing of the time axis may be supported in the frame selection mode, for example, by operating an element selection control at the bottom of the screen, or adjusting the order of the multimedia elements on the time axis, or the like, in the manner described in the foregoing embodiments.
Fig. 18 is a schematic flow chart of a method for editing multimedia in a game according to an embodiment of the present application. In this embodiment, the player may choose to view multimedia data edited by other players.
Optionally, as shown in fig. 18, the method may further include:
S1801, in response to a trigger operation for multimedia viewing, displaying a multimedia data playlist.
The multimedia data play list includes: at least one identification of multimedia data, and a corresponding play virtual control.
Wherein the identification of the multimedia data may be the name of the multimedia data (such as dramatic program name), number, etc. The play virtual control may be a virtual control displayed on a multimedia data playlist.
Fig. 19 is a schematic diagram of a multimedia data playlist according to an embodiment of the present application.
As shown in fig. 19, the identifications of the multimedia data may include: "Yugongshan", "Baishe zhuang", and the like. A play virtual control is correspondingly arranged after the identification of each piece of multimedia data, and the user can play the corresponding multimedia data by clicking the play virtual control.
Optionally, the multimedia data playlist may further include a ranking of the multimedia data, an author (director) of the multimedia data, a playing time period of the multimedia data, etc., which is not limited herein.
S1802, playing the target multimedia data in response to the selection operation of the playing virtual control of the target multimedia data in the multimedia data play list.
Optionally, the multimedia data playlist may further include: and supporting virtual controls corresponding to the identifications of the multimedia data. The method may further comprise: and updating the support data of the target multimedia data in response to the selection operation of the support virtual control of the target multimedia data in the multimedia data play list.
The support virtual control corresponding to the identification of each piece of multimedia data may be a praise (like) virtual control, a follow virtual control, a comment virtual control, and the like. For example, as shown in fig. 19, each multimedia data identification may be correspondingly provided with a viewing virtual control, and the user can follow favorite multimedia data (dramas) by clicking it. Or, when the support virtual control is a praise virtual control or a comment virtual control, the user can praise favorite multimedia data or post a text comment on a certain piece of multimedia data through the praise virtual control or the comment virtual control; the specific type of the support virtual control is not limited in the present application.
After the user performs a selection operation on the support virtual control of a certain piece of multimedia data in the playlist, the support data of that target multimedia data is updated in response to the selection operation. Further, the pieces of multimedia data may be ranked according to their support data to obtain ranking information, and so on.
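As a rough illustration of the bookkeeping described above — a play/support control per playlist entry, and a ranking derived from support data — the following Python sketch uses hypothetical names (`Playlist`, `MultimediaEntry`); the patent does not prescribe any particular data structure:

```python
from dataclasses import dataclass

@dataclass
class MultimediaEntry:
    """One playlist row: identifier, author (director), support count."""
    title: str          # identifier of the multimedia data, e.g. a drama name
    author: str = ""
    supports: int = 0   # updated when a user taps the support virtual control

class Playlist:
    def __init__(self):
        self.entries: dict[str, MultimediaEntry] = {}

    def add(self, entry: MultimediaEntry) -> None:
        self.entries[entry.title] = entry

    def support(self, title: str) -> None:
        # "updating the support data of the target multimedia data"
        self.entries[title].supports += 1

    def ranking(self) -> list[MultimediaEntry]:
        # rank the entries by support data, highest first
        return sorted(self.entries.values(),
                      key=lambda e: e.supports, reverse=True)

playlist = Playlist()
playlist.add(MultimediaEntry("Yugongshan"))
playlist.add(MultimediaEntry("Baishe zhuang"))
playlist.support("Baishe zhuang")
order = [e.title for e in playlist.ranking()]  # "Baishe zhuang" ranks first
```

The ranking here is recomputed on demand; a production list would more likely keep a sorted index, but the update/rank split mirrors the two operations the playlist exposes.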
Based on the method for editing multimedia in a game described in the foregoing embodiments, an embodiment of the present application further provides an apparatus for editing multimedia in a game, where a graphical user interface is provided through a terminal device.
Fig. 20 is a schematic structural diagram of a multimedia editing apparatus in a game according to an embodiment of the present application.
As shown in fig. 20, the multimedia editing apparatus in a game may include: an interface providing module 11, configured to provide a multimedia editing interface on the graphical user interface; a control display module 12, configured to control the multimedia editing interface to display at least one element selection control, where different element selection controls are associated with different time frames; a response module 13, configured to determine, in response to a selection operation on an element selection control, the multimedia element corresponding to the element selection control; and a generating module 14, configured to generate corresponding multimedia data according to the multimedia elements and the time frames corresponding to the multimedia elements.
Optionally, the interface providing module is specifically configured to display a multimedia editing interface through the graphical user interface in response to an editing instruction.
Optionally, the graphical user interface includes a game screen, the game screen includes a first virtual object, and the interface providing module is specifically configured to display the multimedia editing interface in response to a triggering operation on the first virtual object.
Optionally, the multimedia editing interface includes an item selection control, and the control display module is specifically configured to control, in response to a selection operation on the item selection control, the multimedia editing interface to display an object to be edited and at least one element selection control corresponding to the object to be edited.
Optionally, the control display module is specifically configured to control the multimedia editing interface to display, along the horizontal axis, a plurality of time frames arranged in time order, and to display at least one corresponding element selection control along the vertical axis of each time frame.
Fig. 21 is a schematic diagram showing another configuration of a multimedia editing apparatus in a game according to an embodiment of the present application.
Alternatively, as shown in fig. 21, the response module may include: a first response sub-module 131, configured to display, in response to a selection operation on an element selection control, an element submenu corresponding to the element selection control, where the element submenu includes at least one element option; and a second response sub-module 132, configured to acquire, in response to a selection operation on the element submenu, the multimedia element corresponding to each element selection control.
Optionally, the second response sub-module is specifically configured to display a corresponding element editing interface in response to a selection operation on the element submenu, and to acquire, in response to the editing parameters of the element editing interface, the multimedia element corresponding to the element selection control.
Optionally, the element submenu includes one or more of the following: entering, transition, exiting, movement, action, speaking, and expression. When the element submenu contains any one of entering, transition, exiting, or movement, the element editing interface includes a movement coordinate information editing interface.
Optionally, the coordinate points of the movement coordinate information editing interface correspond to coordinate points at the corresponding positions of the multimedia presentation interface within the game interface; the editing parameters of the element editing interface include the selected end position of the virtual character's movement.
Optionally, the element editing interface includes m×n virtual selection grids, each grid corresponding to a coordinate point at the corresponding position of the multimedia presentation interface within the game interface, where m and n are integers greater than 0; the second response sub-module is specifically configured to acquire, in response to a selection operation on a target grid in the element editing interface, the end position information of the multimedia element corresponding to the element selection control.
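The m×n selection grid described here can be read as a simple affine mapping from grid cells to scene coordinates. A minimal sketch, in which the function name and the centre-of-cell convention are illustrative assumptions rather than anything the patent specifies:

```python
def grid_to_scene(cell_row: int, cell_col: int,
                  m: int, n: int,
                  scene_width: float, scene_height: float) -> tuple[float, float]:
    """Map a selected cell of an m x n virtual selection grid to the
    coordinate of the corresponding position in the presentation area.
    Each cell is mapped to the centre of its patch of the scene."""
    if not (0 <= cell_row < m and 0 <= cell_col < n):
        raise ValueError("cell outside the m x n grid")
    x = (cell_col + 0.5) * scene_width / n
    y = (cell_row + 0.5) * scene_height / m
    return x, y

# Selecting the target grid yields the end position of the character's movement.
end_x, end_y = grid_to_scene(2, 3, m=4, n=8, scene_width=800, scene_height=400)
```

With a 4×8 grid over an 800×400 area, the cell at row 2, column 3 maps to the point (350.0, 250.0) — the centre of that cell's 100×100 patch.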
Optionally, the object to be edited includes basic material and/or a virtual character; the element selection controls corresponding to the basic material include one or more of the following: a scene selection control, a background selection control, a music selection control, and a sound effect selection control.
Optionally, the control display module is specifically configured to control the multimedia editing interface to display the objects to be edited along the vertical axis, to display a plurality of time frames arranged in time order along the horizontal axis, and to display, along the vertical axis of each time frame, at least one element selection control corresponding to each object to be edited, so that the element selection controls are associated with different time frames.
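The layout just described — objects to be edited on the vertical axis, time frames on the horizontal axis, one element selection control per cell — amounts to a small track/timeline model. A hypothetical sketch (the patent fixes no storage format; the class and method names are assumptions):

```python
from collections import defaultdict

class Timeline:
    """Objects to be edited (rows) x time frames (columns).
    Each cell holds the multimedia element chosen through the element
    selection control associated with that object and that time frame."""
    def __init__(self):
        # object name -> {time frame index: multimedia element}
        self.tracks = defaultdict(dict)

    def set_element(self, obj: str, frame: int, element: str) -> None:
        self.tracks[obj][frame] = element

    def generate(self) -> list[tuple[int, str, str]]:
        """Generate multimedia data: all chosen elements, ordered by
        their time frames (ties broken by object name)."""
        cells = [(frame, obj, elem)
                 for obj, frames in self.tracks.items()
                 for frame, elem in frames.items()]
        return sorted(cells)

tl = Timeline()
tl.set_element("character A", 0, "enter")
tl.set_element("character A", 1, "speak: line 1")
tl.set_element("background", 0, "night scene")
```

Calling `tl.generate()` then yields the frame-0 cells before the frame-1 cell, which is the "generate corresponding multimedia data according to the multimedia elements and their time frames" step in miniature.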
Fig. 22 is a schematic diagram showing still another configuration of a multimedia editing apparatus in a game according to an embodiment of the present application.
Optionally, as shown in fig. 22, the multimedia editing apparatus in a game may further include: a preview playing module 15, configured to play the multimedia data in response to a selection operation on a preview mode selection control, where the playing interface of the multimedia data includes time frame information.
Fig. 23 is a schematic diagram showing still another configuration of a multimedia editing apparatus in a game according to an embodiment of the present application.
Optionally, as shown in fig. 23, the multimedia editing apparatus in a game may further include: a frame selection playing module 16, configured to play the multimedia data in response to a selection operation on a frame selection mode selection control, where the playing interface of the multimedia data includes a time frame selection virtual control; and further configured to, in response to a drag operation on the time frame selection virtual control, make the multimedia data jump to the corresponding time frame for playing.
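The drag-to-jump behaviour of the frame selection mode can be pictured as mapping a drag position along the control to a frame index. The class name, the normalised-drag convention, and the frame rate below are all illustrative assumptions:

```python
class FrameSelectPlayer:
    """Frame-selection playback: dragging the time frame selection
    control jumps playback to the corresponding time frame."""
    def __init__(self, total_frames: int, fps: float = 30.0):
        self.total_frames = total_frames
        self.fps = fps
        self.current = 0

    def drag_to(self, fraction: float) -> int:
        """fraction in [0, 1] along the drag control -> frame index."""
        fraction = min(max(fraction, 0.0), 1.0)  # clamp the drag position
        self.current = round(fraction * (self.total_frames - 1))
        return self.current

    def timestamp(self) -> float:
        """Playback time (seconds) of the current time frame."""
        return self.current / self.fps
```

For a 301-frame clip at 30 fps, dragging to the midpoint lands on frame 150, i.e. 5.0 seconds in; drags past either end are clamped rather than rejected.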
Fig. 24 is a schematic view showing still another configuration of a multimedia editing apparatus in a game according to an embodiment of the present application.
Optionally, as shown in fig. 24, the multimedia editing apparatus in a game may further include: a multimedia viewing module 17, configured to display a multimedia data playlist in response to a triggering operation for multimedia viewing, where the multimedia data playlist includes the identifier of at least one piece of multimedia data and a corresponding play virtual control; and further configured to play the target multimedia data in response to a selection operation on the play virtual control of the target multimedia data in the multimedia data playlist.
Optionally, the multimedia data playlist further comprises: supporting virtual controls corresponding to the identifications of the multimedia data; the multimedia viewing module is further configured to update the support data for the target multimedia data in response to a selection operation of the support virtual control for the target multimedia data in the multimedia data playlist.
It will be clear to those skilled in the art that, for convenience and brevity of description, for the specific working process of the apparatus described above, reference may be made to the corresponding process in the foregoing method embodiments, which is not repeated here.
In the several embodiments provided in the present application, it should be understood that the disclosed method and apparatus may be implemented in other manners. The apparatus embodiments described above are merely illustrative; the division into modules is merely a division by logical function, and there may be other divisions in actual implementation: for example, multiple modules or components may be combined or integrated into another system, or some features may be omitted or not performed. In addition, the couplings or direct couplings or communication connections shown or discussed may be implemented through communication interfaces, and the indirect couplings or communication connections between devices or modules may be electrical, mechanical, or in other forms.
The modules described as separate components may or may not be physically separate, and components shown as modules may or may not be physical units, may be located in one place, or may be distributed over multiple network units. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution of this embodiment.
In addition, each functional module in the embodiments of the present application may be integrated in one processing unit, or each module may exist alone physically, or two or more modules may be integrated in one module.
If the functions are implemented in the form of software functional units and sold or used as a stand-alone product, they may be stored in a non-volatile computer-readable storage medium executable by a processor. Based on this understanding, the technical solution of the present application, in essence, or the part contributing to the prior art, may be embodied in the form of a software product stored in a storage medium and including several instructions for causing a terminal device to perform all or part of the steps of the methods described in the embodiments of the present application.
On the basis of the above embodiments, an embodiment of the present application further provides a game data processing method, where a graphical user interface is provided by a terminal device, the graphical user interface includes a game screen, and the game screen includes at least part of a game scene and a virtual object. The method includes the following steps:
a. a specific scene area in the game scene is determined.
The specific scene area may be an area of the game scene in which multimedia data can be displayed; for example, after a "stage", "martial arts hall", "banquet hall", or the like appears in a game scene, the multimedia data may be displayed at these positions.
b. Multimedia data is acquired, wherein the multimedia data comprises virtual object information and performance parameters.
c. Rendering a virtual character corresponding to the virtual object information in the specific scene area according to the performance parameters, and controlling the virtual character to perform.

The virtual character may be a character in the game or a specific character in a specific performance scene, which is not limited here. It should be emphasized that the performance of the virtual character in this embodiment takes place within the game scene, and the virtual character is rendered in the same game world in which the player-controlled characters exist; it is not a pre-produced CG animation as in the prior art, nor an animation played while switching between game scenes. The multimedia data may be data edited by the player or by other players through the foregoing multimedia editing method, and the performance parameters may be the editing parameters entered during editing; reference may be made to the foregoing embodiments, which are not limited here.

In this embodiment, the multimedia data produced by players is played in the specific scene area during the game, and the virtual characters are controlled to perform in the game scene according to the multimedia data, for example to perform specific actions or carry out specific dialogues.
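One way to picture the "performance parameters" driving the characters is as a time-ordered list of (frame, character, action) steps. The JSON encoding and function name below are purely hypothetical; the patent does not define any serialization format:

```python
import json

def perform(performance_parameters: str) -> list[str]:
    """Execute edited performance parameters in the scene area:
    each step names a character, a time frame, and an action.
    The JSON shape is an illustrative assumption."""
    steps = json.loads(performance_parameters)
    log = []
    # play the steps back in time-frame order
    for step in sorted(steps, key=lambda s: s["frame"]):
        log.append(f'frame {step["frame"]}: {step["character"]} -> {step["action"]}')
    return log

params = json.dumps([
    {"frame": 1, "character": "A", "action": "bow"},
    {"frame": 0, "character": "B", "action": "enter"},
])
script = perform(params)  # B enters at frame 0, then A bows at frame 1
```

A real engine would dispatch each step to an animation or dialogue system instead of logging a string, but the frame-ordered dispatch loop is the essential shape.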
Optionally, the multimedia data includes at least two pieces of multimedia sub-data, and the multimedia sub-data includes data edited and uploaded by players.
The multimedia sub-data may be edited by the same player or by different players through the method and interface provided by the foregoing method embodiment.
In this embodiment, the multimedia data may be obtained by selecting a preset number of pieces of multimedia sub-data according to a certain attribute, which is not limited here. For example, the top 15 pieces of multimedia data are selected by reference to the support data described in the foregoing embodiments.

Further, according to the priority order corresponding to a preset condition, virtual characters corresponding to the virtual object information are rendered in the specific scene area in sequence according to the performance parameters of the at least two pieces of multimedia sub-data, and the virtual characters are controlled to perform. That is, the pieces of multimedia sub-data are played one after another in the specific scene area of the game interface in a certain order; for example, the top 15 pieces of multimedia data on a leaderboard are played in the ranking order of the leaderboard.
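The selection-and-ordering rule above (for example, the top 15 pieces by support data, performed in leaderboard order) can be sketched as follows; the function name and the dict keys are illustrative assumptions:

```python
def select_program(sub_data: list[dict], top_n: int = 15) -> list[dict]:
    """Pick the top_n pieces of multimedia sub-data by support data and
    return them in the order in which they will be performed in the
    specific scene area (the ranking order of the leaderboard)."""
    ranked = sorted(sub_data, key=lambda d: d["supports"], reverse=True)
    return ranked[:top_n]

queue = select_program([
    {"title": "Yugongshan", "supports": 40},
    {"title": "Baishe zhuang", "supports": 55},
    {"title": "Other", "supports": 3},
], top_n=2)
# performed in leaderboard order: "Baishe zhuang" first, then "Yugongshan"
```

The returned queue is then played one piece at a time in the scene area; any other "preset condition" (recency, author, etc.) would just swap the sort key.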
Another embodiment of the present application further provides a game data processing method, where a graphical user interface is provided by a terminal device, the graphical user interface includes a game screen, and the game screen includes at least part of a game scene and a virtual object. The method includes:
a. Determining a specific scene area in the game scene, where the specific scene area contains the virtual object performing according to performance parameters.

The specific scene area may be an area of the game world in which multimedia data can be displayed; for example, after a "stage", "martial arts hall", "banquet hall", or the like appears in a game scene, the multimedia data may be displayed at these positions, such as one or more virtual characters performing in the specific scene area. It should be emphasized that the performance of the virtual character in this embodiment takes place within the game scene, and the virtual character is rendered in the same game world in which the player-controlled characters exist; it is not a pre-produced CG animation as in the prior art, nor an animation played while switching between game scenes.
b. In response to current position information of the player's virtual character, controlling the viewing angle of the specific scene area to switch to the angle corresponding to the current position information.

In this embodiment, a player can watch multimedia data within the game scene, and the presentation viewing angle of the multimedia data in the specific scene area can be adaptively adjusted according to where the player's virtual character stands, so that the player can watch the multimedia data more comfortably, improving the user experience.

It should be noted that the multimedia data may be data edited by the player or by other players through the foregoing multimedia editing method, and the performance parameters may be the editing parameters entered during editing; reference may be made to the foregoing embodiments, which are not limited here.

For example, after the player's virtual character arrives near the stage in the game scene, multimedia data starts to be presented on the stage, and the orientation of the stage is adaptively adjusted according to the position of the player's virtual character, so that the performance on the stage faces toward the viewing angle of the player's virtual character. Assuming the player's virtual character stands to the front-left of the stage, the stage is deflected to the left.
Optionally, controlling, in response to the current position information of the player's virtual character, the viewing angle of the specific scene area to switch to the angle corresponding to the current position information includes: in response to the player's virtual character moving to the position of a target position selection control, controlling the viewing angle of the specific scene area to switch to the viewing angle corresponding to the target position selection control.

A plurality of position selection controls may be arranged within a preset range of the specific scene area, so that the player can more clearly determine the standing position from which the player's virtual character watches the multimedia data by arriving at or selecting a position selection control, and the viewing angle of the specific scene area is then switched to the viewing angle corresponding to the target position selection control.

Alternatively, a position selection control may be a specific virtual item, such as a virtual chair or a virtual stool; after the player's virtual character arrives at the position of the virtual item (for example, walks to a certain virtual chair and sits down), the coordinate position of that virtual item is the current position information. A position selection control may also be a virtual frame, a virtual circle, or the like displayed in the game scene; when the player's virtual character arrives inside a certain virtual frame, the coordinate position of that virtual frame is the current position information.
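A possible way to turn the stage toward the player's current position is to derive a yaw angle from the player's offset relative to the stage. The function name, the 2D (x, z) ground-plane coordinates, and the sign convention (negative yaw = turn left) are assumptions for illustration, not from the patent:

```python
import math

def stage_yaw(stage_pos: tuple[float, float],
              player_pos: tuple[float, float]) -> float:
    """Yaw (degrees) that turns the stage front toward the player's
    virtual character, so the performance faces the viewer's position.
    Positions are (x, z) on the ground plane; the stage's default
    front faces +z. Negative yaw means the stage deflects to the left."""
    dx = player_pos[0] - stage_pos[0]
    dz = player_pos[1] - stage_pos[1]
    return math.degrees(math.atan2(dx, dz))

# Player stands to the front-left of the stage -> the stage turns left.
yaw = stage_yaw(stage_pos=(0.0, 0.0), player_pos=(-3.0, 5.0))
```

With the player directly in front (offset (0, 5)) the yaw is 0°; the front-left offset above gives roughly -31°, i.e. a leftward deflection, matching the example in the text.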
Further, after the specific scene area in the game scene is determined, the game screen is controlled to display interactive material controls of the virtual objects, where the interactive material controls include one or more of the following: a character dialogue control, a scenario control, and a character control; and in response to a click operation on an interactive material control, the corresponding performance scenario is switched.

In this embodiment, during playback of the multimedia data, the specific scene area may also display the interactive material controls of one or more virtual characters in the scenario as the scenario of the multimedia data progresses. For example, when the scenario progresses to a certain virtual character's dialogue, that character's dialogue is displayed; the dialogue box is a character dialogue control. After the player clicks it, another virtual character's reply dialogue control pops up; the player can keep clicking, and so on, advancing the progress of the scenario.

Alternatively, when the scenario progresses to a certain node, a "scenario control" may pop up, and the player selects whether to advance the scenario, for example "enter the second scenario", "enter the third scenario", and so on. The "character control" may likewise appear when the scenario progresses to a certain node, letting another character enter, or an existing character exit, to develop the scenario, for example "character X exits" or "character Y enters"; embodiments of the present application are not specifically limited here.
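The click-to-advance dialogue behaviour can be sketched as a simple cursor over (character, line) pairs; the class and its API are illustrative only and not defined by the patent:

```python
class DialogueChain:
    """Character dialogue controls: each click on the current dialogue
    box reveals the next character's reply, advancing the scenario."""
    def __init__(self, lines: list[tuple[str, str]]):
        self.lines = lines   # (character, line) pairs in scenario order
        self.index = -1      # nothing shown until the first click

    def click(self):
        """Player clicks the dialogue control; returns the next
        (character, line) pair, or None when this scenario segment
        is exhausted (e.g. time for a scenario/character control)."""
        self.index += 1
        if self.index >= len(self.lines):
            return None
        return self.lines[self.index]
```

When `click()` returns `None`, the UI would switch to a scenario control or character control to let the player choose how the plot continues.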
Another embodiment of the present application further provides a game data processing method, where a graphical user interface is provided by a terminal device, the graphical user interface includes a game screen, and the game screen includes at least part of a game scene and a virtual object. The method includes:
a. a specific scene region in the game scene is determined, the specific scene region containing the virtual objects performing according to performance parameters.
The specific scene area may be an area of the game world in which multimedia data can be displayed; for example, after a "stage", "martial arts hall", "banquet hall", or the like appears in a game scene, the multimedia data may be displayed at these positions, such as one or more virtual characters performing in the specific scene area.
b. In response to a selection operation on a specific virtual control, switching the current display interface from the game scene to a multimedia presentation interface, where the multimedia presentation interface includes the virtual object performing according to the performance parameters.

In this embodiment, the specific virtual control may be a specific position in the game scene, a specific NPC, or a virtual key; through a selection operation on the specific virtual control, the current game interface can be switched to the multimedia playing interface, so that the player watches the multimedia data on the multimedia playing interface.

The multimedia playing interface may be another game interface different from the current game world. It may include a multimedia display frame in which the multimedia data is played, and other positions of the interface may further include functional controls such as bullet-comment (danmaku) input, comments, support, and a progress bar; this embodiment is not limited here.
It should be noted that the multimedia data may be data edited by the player or by other players through the foregoing multimedia editing method, and the performance parameters may be the editing parameters entered during editing; reference may be made to the foregoing embodiments, which are not limited here.
Optionally, the multimedia presentation interface includes a multimedia data playlist, where the multimedia data playlist includes: at least one identification of multimedia data, and a corresponding play virtual control.
Accordingly, the target multimedia data is played in response to the selection operation of the playing virtual control of the target multimedia data in the multimedia data play list.
It should be noted that, when the specific virtual control is operated, the multimedia data playlist may pop up on the game interface, or may be displayed after switching to the multimedia presentation interface, which is not limited here.
The multimedia data play list may refer to the embodiment shown in fig. 19 and will not be described herein.
An embodiment of the present application further provides a terminal device, which may be a mobile phone, a tablet computer, a game console, or the like, without limitation.
Fig. 25 shows a schematic structural diagram of a terminal device provided in an embodiment of the present application. As shown in fig. 25, the terminal device may include: a processor 21, a storage medium 22, and a bus (not shown). The storage medium 22 stores machine-readable instructions executable by the processor 21; when the terminal device is running, the processor 21 and the storage medium 22 communicate over the bus, and the processor 21 executes the machine-readable instructions to perform the method described in the foregoing method embodiments. The specific implementation and technical effects are similar and are not repeated here.

The storage medium may include: a USB flash drive, a removable hard disk, a ROM, a RAM, a magnetic disk, an optical disk, or the like. The processor may include one or more processing cores (for example, a single-core or multi-core processor). By way of example only, the processor may include a central processing unit (CPU), an application-specific integrated circuit (ASIC), an application-specific instruction-set processor (ASIP), a graphics processing unit (GPU), a physics processing unit (PPU), a digital signal processor (DSP), a field-programmable gate array (FPGA), a programmable logic device (PLD), a controller, a microcontroller unit, a reduced-instruction-set computer (RISC), a microprocessor, or the like, or any combination thereof.
In addition, an embodiment of the present application further provides a storage medium on which a computer program is stored; when the computer program is run by a processor, the method in the foregoing method embodiments is performed. The specific implementation and technical effects are similar and are not repeated here.
The above description covers only preferred embodiments of the present application and is not intended to limit the present application; those skilled in the art may make various modifications and variations to it. Any modification, equivalent replacement, improvement, and the like made within the spirit and principles of the present application shall fall within the protection scope of the present application.

Claims (9)

1. A game data processing method, characterized in that a graphical user interface is provided by a terminal device, the graphical user interface comprising at least part of a game scene and part of a virtual object, the method comprising: determining a specific scene area in the game scene; acquiring multimedia data, wherein the multimedia data is preset data, the multimedia data comprises at least two pieces of multimedia sub-data, and the multimedia data comprises virtual object information and performance parameters; and rendering a virtual character corresponding to the virtual object information in the specific scene area according to the performance parameters, and controlling the virtual character to perform, which comprises: acquiring a priority order; and rendering, according to the priority order and according to the at least two pieces of multimedia sub-data, the virtual character corresponding to the virtual object information in the specific scene area, and controlling the virtual character to perform;
The method further comprises the steps of:
providing a multimedia editing interface on the graphical user interface; controlling the multimedia editing interface to display at least one element selection control, wherein different element selection controls are associated with different time frames; responding to the selection operation of the element selection control, and determining a multimedia element corresponding to the element selection control; and generating corresponding multimedia data according to the multimedia elements and the time frames corresponding to the multimedia elements.
2. The method of claim 1, wherein the multimedia sub data includes data edited and uploaded by a player corresponding to a current terminal device and a player corresponding to other terminal devices.
3. The method of claim 1, wherein the priority order is a ranking order of a leaderboard.
4. The method of claim 1, wherein the multimedia editing interface comprises an item selection control, the controlling the multimedia editing interface to display at least one element selection control comprising: and responding to the selection operation of the item selection control, and controlling the multimedia editing interface to display an object to be edited and at least one element selection control corresponding to the object to be edited.
5. The method of claim 1, wherein the controlling the multimedia editing interface to display at least one element selection control comprises: and controlling the horizontal axis direction of the multimedia editing interface to display a plurality of time frames arranged in time sequence, and displaying at least one corresponding element selection control in the vertical axis direction of each time frame.
6. The method of claim 1, wherein the determining, in response to the selection operation on the element selection control, the multimedia element corresponding to the element selection control comprises: displaying, in response to the selection operation on the element selection control, an element submenu corresponding to the element selection control, wherein the element submenu comprises at least one element option; displaying a corresponding element editing interface in response to a selection operation on the element submenu; and acquiring, in response to editing parameters of the element editing interface, the multimedia element corresponding to the element selection control, wherein the performance parameters are the editing parameters input during the editing process.
7. The method of claim 6, wherein the element submenu comprises one or more of the following: entering, transition, exiting, movement, action, speaking, and expression.
8. A terminal device, comprising: a processor, a storage medium and a bus, the storage medium storing machine-readable instructions executable by the processor, the processor and the storage medium communicating over the bus when the terminal device is running, the processor executing the machine-readable instructions to perform the method of any of claims 1-7.
9. A storage medium having stored thereon a computer program which, when executed by a processor, performs the method of any of claims 1-7.
CN201911114316.2A 2019-10-18 2019-10-18 Method, device, terminal equipment and storage medium for editing multimedia in game Active CN111124402B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201911114316.2A CN111124402B (en) 2019-10-18 2019-10-18 Method, device, terminal equipment and storage medium for editing multimedia in game

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201910995729.XA CN110737435B (en) 2019-10-18 2019-10-18 Method, device, terminal equipment and storage medium for editing multimedia in game
CN201911114316.2A CN111124402B (en) 2019-10-18 2019-10-18 Method, device, terminal equipment and storage medium for editing multimedia in game

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
CN201910995729.XA Division CN110737435B (en) 2019-10-18 2019-10-18 Method, device, terminal equipment and storage medium for editing multimedia in game

Publications (2)

Publication Number Publication Date
CN111124402A CN111124402A (en) 2020-05-08
CN111124402B true CN111124402B (en) 2023-09-26

Family

ID=69270346

Family Applications (4)

Application Number Title Priority Date Filing Date
CN201910995729.XA Active CN110737435B (en) 2019-10-18 2019-10-18 Method, device, terminal equipment and storage medium for editing multimedia in game
CN201911114289.9A Active CN111124401B (en) 2019-10-18 2019-10-18 Method, device, terminal equipment and storage medium for editing multimedia in game
CN201911115374.7A Active CN111124403B (en) 2019-10-18 2019-10-18 Method, device, terminal equipment and storage medium for editing multimedia in game
CN201911114316.2A Active CN111124402B (en) 2019-10-18 2019-10-18 Method, device, terminal equipment and storage medium for editing multimedia in game

Family Applications Before (3)

Application Number Title Priority Date Filing Date
CN201910995729.XA Active CN110737435B (en) 2019-10-18 2019-10-18 Method, device, terminal equipment and storage medium for editing multimedia in game
CN201911114289.9A Active CN111124401B (en) 2019-10-18 2019-10-18 Method, device, terminal equipment and storage medium for editing multimedia in game
CN201911115374.7A Active CN111124403B (en) 2019-10-18 2019-10-18 Method, device, terminal equipment and storage medium for editing multimedia in game

Country Status (1)

Country Link
CN (4) CN110737435B (en)

Families Citing this family (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111399711A (en) * 2020-03-10 2020-07-10 广州通达汽车电气股份有限公司 Interface editing method, device, equipment and storage medium
CN113694531B (en) * 2020-05-21 2024-01-19 抖音视界有限公司 Game special effect generation method and device, electronic equipment and computer readable medium
CN112044061B (en) * 2020-08-11 2022-05-06 腾讯科技(深圳)有限公司 Game picture processing method and device, electronic equipment and storage medium
CN112073799B (en) * 2020-08-31 2022-07-01 腾讯数码(天津)有限公司 Virtual resource management method and device, computer equipment and readable storage medium
CN112118397B (en) * 2020-09-23 2021-06-22 腾讯科技(深圳)有限公司 Video synthesis method, related device, equipment and storage medium
CN112169314A (en) * 2020-10-20 2021-01-05 网易(杭州)网络有限公司 Method and device for selecting target object in game
CN112256251A (en) * 2020-10-29 2021-01-22 北京冰封互娱科技有限公司 Game data processing method, game data processing device, main body object configuration method, main body object configuration device, and storage medium
CN112843723B (en) * 2021-02-03 2024-01-16 北京字跳网络技术有限公司 Interaction method, interaction device, electronic equipment and storage medium
CN114845171A (en) * 2022-03-21 2022-08-02 维沃移动通信有限公司 Video editing method and device and electronic equipment
CN117499745A (en) * 2023-04-12 2024-02-02 北京优贝卡科技有限公司 Media editing method, device, equipment and storage medium

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN202929567U (en) * 2012-11-19 2013-05-08 深圳市数虎图像科技有限公司 Virtual character animation performance system
CN108961368A (en) * 2018-06-21 2018-12-07 珠海金山网络游戏科技有限公司 The method and system of real-time live broadcast variety show in three-dimensional animation environment
CN110062271A (en) * 2019-04-28 2019-07-26 腾讯科技(成都)有限公司 Method for changing scenes, device, terminal and storage medium

Family Cites Families (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6902481B2 (en) * 2001-09-28 2005-06-07 Igt Decoupling of the graphical presentation of a game from the presentation logic
GB2404300A (en) * 2003-07-25 2005-01-26 Autodesk Inc Compositing and temporally editing clips of image data
CN102693091A (en) * 2012-05-22 2012-09-26 深圳市环球数码创意科技有限公司 Method for realizing three dimensional virtual characters and system thereof
US9308453B2 (en) * 2014-01-09 2016-04-12 Square Enix Holdings Co., Ltd. Online game server architecture using shared rendering
CN108355355A (en) * 2018-03-16 2018-08-03 深圳冰川网络股份有限公司 Control method and system for a 3D sports online game
CN109446346A (en) * 2018-09-14 2019-03-08 传线网络科技(上海)有限公司 Multimedia resource edit methods and device
CN109513212B (en) * 2018-11-19 2020-06-12 苏州好玩友网络科技有限公司 2D mobile game UI (user interface) and scenario editing method and system
CN109582311A (en) * 2018-11-30 2019-04-05 网易(杭州)网络有限公司 Method and device for editing UI in a game, electronic equipment, and storage medium
CN109756511B (en) * 2019-02-02 2021-08-31 珠海金山网络游戏科技有限公司 Data processing method and device, computing equipment and storage medium
CN110227267B (en) * 2019-06-28 2023-02-28 百度在线网络技术(北京)有限公司 Voice skill game editing method, device and equipment and readable storage medium

Also Published As

Publication number Publication date
CN111124403A (en) 2020-05-08
CN111124403B (en) 2023-09-26
CN110737435A (en) 2020-01-31
CN111124401A (en) 2020-05-08
CN110737435B (en) 2024-04-19
CN111124402A (en) 2020-05-08
CN111124401B (en) 2023-09-26

Similar Documents

Publication Publication Date Title
CN111124402B (en) Method, device, terminal equipment and storage medium for editing multimedia in game
CN112334886B (en) Content distribution system, content distribution method, and recording medium
CN106066634B (en) Apparatus and method for selecting object for 3D printing
CN111659115B (en) Virtual role control method and device, computer equipment and storage medium
CN110465097B (en) Character vertical drawing display method and device in game, electronic equipment and storage medium
US9671942B2 (en) Dynamic user interface for inheritance based avatar generation
US20080316227A1 (en) User defined characteristics for inheritance based avatar generation
CN110062271A (en) Method for changing scenes, device, terminal and storage medium
CN111897483A (en) Live broadcast interaction processing method, device, equipment and storage medium
TWI831074B (en) Information processing methods, devices, equipments, computer-readable storage mediums, and computer program products in virtual scene
CN111530088B (en) Method and device for generating real-time expression picture of game role
CN113069759A (en) Scene processing method and device in game and electronic equipment
CN114669059A (en) Method for generating expression of game role
CN114095744A (en) Video live broadcast method and device, electronic equipment and readable storage medium
KR101977893B1 (en) Digital actor managing method for image contents
KR100932675B1 (en) Method of video contents manipulation
CN111530087B (en) Method and device for generating real-time expression package in game
CN112044053A (en) Information processing method, device, equipment and storage medium in virtual scene
CN112150602A (en) Model image rendering method and device, storage medium and electronic equipment
CN115624740A (en) Virtual reality equipment, control method, device and system thereof, and interaction system
CN113426110A (en) Virtual character interaction method and device, computer equipment and storage medium
KR20100096605A (en) Method and system for providing game service by avatar motion editing
CN117138346A (en) Game editing method, game control device and electronic equipment
CN117482531A (en) Method and device for processing motion editing in game, storage medium and electronic equipment
CN117101145A (en) Method and device for generating action in game, storage medium and electronic equipment

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant