CN111124401A - Multimedia editing method and device in game, terminal equipment and storage medium

Info

Publication number: CN111124401A (application number CN201911114289.9A)
Authority: CN (China)
Prior art keywords: multimedia, game, interface, virtual, editing
Legal status: Granted; Active
Other languages: Chinese (zh)
Other versions: CN111124401B
Inventor: 罗青 (Luo Qing)
Original and current assignee: Netease Hangzhou Network Co Ltd
Events: application filed by Netease Hangzhou Network Co Ltd; publication of CN111124401A; application granted; publication of CN111124401B

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 8/00 Arrangements for software engineering
    • G06F 8/30 Creation or generation of source code
    • G06F 8/38 Creation or generation of source code for implementing user interfaces
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F 13/00 Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F 13/60 Generating or modifying game content before or while executing the game program, e.g. authoring tools specially adapted for game development or game-integrated level editor

Landscapes

  • Engineering & Computer Science (AREA)
  • Software Systems (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Multimedia (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Processing Or Creating Images (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The application provides a multimedia editing method and device in a game, a terminal device, and a storage medium, and relates to the technical field of software games. Embodiments of the application provide a multimedia editing interface on a graphical user interface and control the multimedia editing interface to display at least one element selection control, where different element selection controls are associated with different time frames. In response to a selection operation on an element selection control, the multimedia element corresponding to that control is determined, and corresponding multimedia data is generated from the multimedia elements and their associated time frames. Multimedia elements can thus be added to controls corresponding to different time frames through simple selection operations, and multimedia data generated from them, which greatly reduces the difficulty of multimedia editing in games and makes multimedia editing on a touch terminal easier to achieve.

Description

Multimedia editing method and device in game, terminal equipment and storage medium
This application is a divisional application of the Chinese patent application with application number 201910995729.X, filed on October 18, 2019, which is incorporated herein by reference.
Technical Field
The application relates to the technical field of software games, in particular to a multimedia editing method and device in a game, terminal equipment and a storage medium.
Background
Cut-scene animations are widely used in client (PC) games, mobile games, console games, and other game types. Like a movie, a cut-scene presents part of the plot to the player in the form of an animated video, which gives the player a strong sense of immersion in the game and relieves fatigue between game levels. Cut-scenes are usually produced by professional game developers using professional editing tools, controlling display elements such as camera motion, character walking, character actions, and character dialogue along a timeline; ordinary game players cannot participate in or complete their production.
In the prior art, to satisfy players' enthusiasm for creating cut-scenes and to improve their game experience, some online game operators provide a cut-scene editing tool so that players can produce cut-scenes inside the game. For example, in an animation editing tool provided by some online games on the Personal Computer (PC) platform, the game player is given a timeline editing interface on which, with the timeline as the main line, the player edits cut-scene elements in the time dimension by controlling the switching between multiple cameras, the parameters of a single camera, sound effects, and so on.
However, the timeline interface of such conventional cut-scene editing tools remains complicated, and editing a cut-scene is still difficult for the game player.
Disclosure of Invention
The application provides a multimedia editing method and device in a game, a terminal device, and a storage medium, which can reduce the difficulty a game player faces when editing a cut-scene.
In a first aspect, an embodiment of the present application provides a multimedia editing method in a game, where a graphical user interface is provided by a terminal device, and the method includes:
providing a multimedia editing interface on the graphical user interface;
controlling the multimedia editing interface to display at least one element selection control, wherein different element selection controls are associated with different time frames;
responding to the selection operation of the element selection control, and determining a multimedia element corresponding to the element selection control;
and generating corresponding multimedia data according to the multimedia elements and the time frames corresponding to the multimedia elements.
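To make the data flow of these four steps concrete, the following is a minimal sketch, not part of the original disclosure, of one way the frame-to-control association could be modeled; the Timeline and MultimediaElement names, the slot count, and all behavior are illustrative assumptions.

```python
from dataclasses import dataclass, field

@dataclass
class MultimediaElement:
    name: str                      # e.g. "enter", "move", "speak"
    params: dict = field(default_factory=dict)

@dataclass
class Timeline:
    # One slot per time frame; each slot plays the role of the element
    # selection control associated with that frame (None until selected).
    slots: list = field(default_factory=lambda: [None] * 8)

    def select(self, frame: int, element: MultimediaElement) -> None:
        # A selection operation determines the element for one control/frame.
        self.slots[frame] = element

    def generate(self):
        # Generate multimedia data from the elements and their time frames.
        return [(i, e.name) for i, e in enumerate(self.slots) if e is not None]

timeline = Timeline()
timeline.select(0, MultimediaElement("enter"))
timeline.select(1, MultimediaElement("move"))
print(timeline.generate())         # [(0, 'enter'), (1, 'move')]
```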
Optionally, the providing a multimedia editing interface on the graphical user interface includes:
and responding to the editing instruction, and displaying the multimedia editing interface through the graphical user interface.
Optionally, the graphical user interface includes a game screen, the game screen includes a first virtual object, and the responding to the editing instruction includes:
in response to a triggering operation on the first virtual object.
Optionally, the multimedia editing interface includes a project selection control, and the controlling the multimedia editing interface to display at least one element selection control includes:
and responding to the selection operation of the project selection control, controlling the multimedia editing interface to display the object to be edited and at least one element selection control corresponding to the object to be edited.
Optionally, the controlling the multimedia editing interface to display at least one element selection control includes:
and controlling the horizontal axis direction of the multimedia editing interface to display a plurality of time frames which are arranged in a time sequence, and displaying at least one corresponding element selection control in the vertical axis direction of each time frame.
Optionally, the determining, in response to the selection operation on the element selection control, the multimedia element corresponding to the element selection control includes:
in response to the selection operation on the element selection control, displaying an element submenu corresponding to the element selection control, where the element submenu includes: at least one element selection item;
and in response to a selection operation on the element submenu, acquiring the multimedia element corresponding to the element selection control.
Optionally, the obtaining the multimedia element corresponding to each element selection control in response to the selection operation of the element submenu includes:
in response to the selection operation on the element submenu, displaying a corresponding element editing interface;
and acquiring, according to the editing parameters of the element editing interface, the multimedia element corresponding to the element selection control.
Optionally, the element submenu includes one or more of: entering, transition, exiting, moving, speaking, and expression; when the element submenu contains any one of entering, transition, exiting, and moving, the element editing interface includes a movement coordinate information editing interface.
Optionally, each coordinate point of the movement coordinate information editing interface corresponds to a coordinate point at the corresponding position in a multimedia presentation interface in the game interface; the editing parameters of the element editing interface include a selection of the end position of the virtual character.
Optionally, the element editing interface includes m × n virtual selection grids, each grid corresponds to a coordinate point at a corresponding position in the multimedia presentation interface in the game interface, where m and n are integers greater than 0; the obtaining of the multimedia element corresponding to each element selection control in response to the editing parameter of the element editing interface includes:
and responding to the selection operation of the target grid in the element editing interface, and acquiring the end position information of the multimedia element corresponding to the element selection control.
Optionally, the object to be edited includes: base material and/or virtual characters; the element selection controls corresponding to the base material include one or more of: a scene selection control, a background selection control, a music selection control, and a sound effect selection control.
Optionally, the controlling the multimedia editing interface to display at least one element selection control includes:
the method comprises the steps of controlling a longitudinal axis direction of a multimedia editing interface to display an object to be edited, displaying a plurality of time frames arranged according to a time sequence in a transverse axis direction, and displaying at least one element selection control corresponding to each object to be edited in the longitudinal axis direction of each time frame so as to associate the element selection controls with different time frames.
Optionally, after generating the corresponding multimedia data according to the multimedia element and the time frame corresponding to the multimedia element, the method further includes:
and responding to the selection operation of the preview mode selection control, and playing the multimedia data, wherein the playing interface of the multimedia data contains time frame information.
Optionally, after generating the corresponding multimedia data according to the multimedia element and the time frame corresponding to the multimedia element, the method further includes:
in response to a selection operation on a frame selection mode selection control, playing the multimedia data, where the playing interface of the multimedia data includes a time frame selection virtual control;
and in response to a dragging operation on the time frame selection virtual control, jumping the multimedia data to the corresponding time frame for playing.
Optionally, the method further comprises:
in response to a multimedia viewing trigger operation, displaying a multimedia data playlist, the multimedia data playlist including: at least one multimedia data identifier and a corresponding play virtual control;
and in response to a selection operation on the play virtual control of target multimedia data in the multimedia data playlist, playing the target multimedia data.
Optionally, the multimedia data playlist further includes a support virtual control corresponding to each multimedia data identifier; the method further includes:
in response to a selection operation on the support virtual control of target multimedia data in the multimedia data playlist, updating the support data of the target multimedia data.
In a second aspect, an embodiment of the present application provides a multimedia editing apparatus in a game, where a graphical user interface is provided through a terminal device. The apparatus includes: an interface providing module, configured to provide a multimedia editing interface on the graphical user interface; a control display module, configured to control the multimedia editing interface to display at least one element selection control, where different element selection controls are associated with different time frames; a response module, configured to determine, in response to a selection operation on an element selection control, the multimedia element corresponding to the element selection control; and a generating module, configured to generate corresponding multimedia data according to the multimedia elements and their corresponding time frames.
Optionally, the interface providing module is specifically configured to, in response to the editing instruction, display a multimedia editing interface through the graphical user interface.
Optionally, the graphical user interface includes a game screen, the game screen includes a first virtual object, and the interface providing module is specifically configured to respond to a trigger operation on the first virtual object.
Optionally, the multimedia editing interface includes a project selection control, and the control display module is specifically configured to, in response to a selection operation on the project selection control, control the multimedia editing interface to display an object to be edited and at least one element selection control corresponding to the object to be edited.
Optionally, the control display module is specifically configured to control a horizontal axis direction of the multimedia editing interface to display a plurality of time frames arranged in a time sequence, and a vertical axis direction of each time frame displays at least one corresponding element selection control.
Optionally, the response module comprises: the first response submodule is used for responding to the selection operation of the element selection control and displaying an element submenu corresponding to the element selection control, and the element submenu comprises: at least one element selection item; and the second response submodule is used for responding to the selection operation of the element submenu and acquiring the multimedia elements corresponding to the element selection controls.
Optionally, the second response submodule is specifically configured to, in response to a selection operation on the element submenu, display a corresponding element editing interface; and responding to the editing parameters of the element editing interface, and acquiring the multimedia elements corresponding to the element selection controls.
Optionally, the element submenu includes one or more of: entering, transition, exiting, moving, speaking, and expression; when the element submenu contains any one of entering, transition, exiting, and moving, the element editing interface includes a movement coordinate information editing interface.
Optionally, each coordinate point of the movement coordinate information editing interface corresponds to a coordinate point at the corresponding position in a multimedia presentation interface in the game interface; the editing parameters of the element editing interface include a selection of the end position of the virtual character.
Optionally, the element editing interface includes m × n virtual selection grids, each grid corresponds to a coordinate point at a corresponding position in the multimedia presentation interface in the game interface, where m and n are integers greater than 0; the second response submodule is specifically configured to respond to a selection operation of a target grid in the element editing interface, and acquire end point position information of a multimedia element corresponding to the element selection control.
Optionally, the object to be edited includes: base material and/or virtual characters; the element selection controls corresponding to the base material include one or more of: a scene selection control, a background selection control, a music selection control, and a sound effect selection control.
Optionally, the control display module is specifically configured to control a longitudinal axis direction of the multimedia editing interface to display the object to be edited, a transverse axis direction of the multimedia editing interface displays a plurality of time frames arranged in a time sequence, and a longitudinal axis direction of each time frame displays at least one element selection control corresponding to each object to be edited, so as to associate the element selection controls with different time frames.
Optionally, the apparatus further comprises: and the preview playing module is used for responding to the selection operation of the preview mode selection control and playing the multimedia data, wherein the playing interface of the multimedia data contains time frame information.
Optionally, the apparatus further includes a frame selection playing module, configured to play the multimedia data in response to a selection operation on the frame selection mode selection control, where the playing interface of the multimedia data includes a time frame selection virtual control, and to jump the multimedia data to the corresponding time frame for playing in response to a dragging operation on the time frame selection virtual control.
Optionally, the apparatus further includes a multimedia viewing module, configured to display a multimedia data playlist in response to a multimedia viewing trigger operation, the multimedia data playlist including: at least one multimedia data identifier and a corresponding play virtual control; and configured to play target multimedia data in response to a selection operation on the play virtual control of the target multimedia data in the multimedia data playlist.
Optionally, the multimedia data playlist further includes: the support virtual control corresponding to the identification of each multimedia data; the multimedia viewing module is also used for responding to the selection operation of the supporting virtual control of the target multimedia data in the multimedia data play list and updating the supporting data of the target multimedia data.
Embodiments of the application thus provide a multimedia editing interface on a graphical user interface and control it to display at least one element selection control, where different element selection controls are associated with different time frames; the multimedia element corresponding to an element selection control is determined in response to a selection operation on it, and corresponding multimedia data is generated from the multimedia elements and their associated time frames. Multimedia elements can therefore be added to controls corresponding to different time frames through simple selection operations, and multimedia data generated from them, which greatly reduces the difficulty of multimedia editing in games and makes multimedia editing on a touch terminal easier to achieve.
In a fourth aspect, an embodiment of the present application provides a game data processing method, where a terminal device provides a graphical user interface, where the graphical user interface includes a game screen, and the game screen includes at least a part of a game scene and a virtual object, and the method includes:
determining a specific scene area in the game scene;
acquiring multimedia data, wherein the multimedia data comprises virtual object information and performance parameters;
rendering a virtual character corresponding to the virtual object information in the specific scene area according to the performance parameters, and controlling the virtual character to perform.
Optionally, the multimedia data includes at least two pieces of multimedia sub-data, and the multimedia sub-data includes data edited and uploaded by players.
Optionally, the rendering, in the specific scene area and according to the performance parameters, a virtual character corresponding to the virtual object information, and controlling the virtual character to perform includes:
rendering, in the specific scene area and in sequence according to a priority order corresponding to a preset condition, the virtual characters corresponding to the virtual object information according to the performance parameters of the at least two pieces of multimedia sub-data, and controlling the virtual characters to perform.
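A minimal sketch of the priority-ordered performance just described, assuming (purely for illustration) that the preset condition is a support count; every name here is a stand-in.

```python
def perform_in_priority_order(subdata_list, priority_key):
    # Render each piece of multimedia sub-data in turn, highest priority
    # first; in the real game each entry would drive a virtual character
    # in the specific scene area according to its performance parameters.
    for sub in sorted(subdata_list, key=priority_key, reverse=True):
        print(f"{sub['character']} performs {sub['performance']}")

subdata = [
    {"character": "B", "performance": "dance", "supports": 3},
    {"character": "A", "performance": "song", "supports": 9},
]
# Preset condition (assumed): more supports means higher priority.
perform_in_priority_order(subdata, priority_key=lambda s: s["supports"])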
In a fifth aspect, an embodiment of the present application provides a game data processing method, where a terminal device provides a graphical user interface, where the graphical user interface includes a game screen, and the game screen includes at least a part of a game scene and a virtual object, and the method includes:
determining a specific scene area in the game scene, the specific scene area containing the virtual object performing according to performance parameters;
and responding to the current position information of the virtual character of the player, and controlling the watching visual angle of the specific scene area to be converted into the visual angle corresponding to the current position information.
Optionally, the preset range of the specific scene area includes: a plurality of position selection controls;
the controlling, in response to current position information of a player's virtual character, the viewing perspective of the specific scene area to switch to the perspective corresponding to the current position information includes:
in response to the player's virtual character moving to the position of a target position selection control, controlling the viewing perspective of the specific scene area to switch to the perspective corresponding to the target position selection control.
Optionally, after determining a specific scene area in the game scene, the method further includes:
displaying interactive material controls for the virtual object, the interactive material controls including one or more of: character dialogue controls, scenario controls, and character controls;
and responding to the clicking operation of the interactive material control, and switching the corresponding performance scenario.
In a sixth aspect, an embodiment of the present application provides a game data processing method, where a terminal device provides a graphical user interface, where the graphical user interface includes a game screen, and the game screen includes at least a part of a game scene and a virtual object, and the method includes:
determining a specific scene area in the game scene, the specific scene area containing the virtual object performing according to performance parameters;
in response to a selection operation of a specific virtual control, the current display interface is switched from the game scene to a multimedia presentation interface, and the multimedia presentation interface contains the virtual object which performs according to performance parameters.
Optionally, the multimedia presentation interface includes a multimedia data playlist, and the multimedia data playlist includes: at least one multimedia data identifier and a corresponding playing virtual control;
and responding to the selection operation of the playing virtual control of the target multimedia data in the multimedia data playing list, and playing the target multimedia data.
In a seventh aspect, an embodiment of the present application provides a terminal device, including: a processor, a storage medium and a bus, the storage medium storing machine-readable instructions executable by the processor, the processor and the storage medium communicating via the bus when the terminal device is operating, the processor executing the machine-readable instructions to perform the method according to any one of the first or third to sixth aspects.
In an eighth aspect, an embodiment of the present application provides a storage medium, on which a computer program is stored, where the computer program is executed by a processor to perform the method according to any one of the first, second, third, and sixth aspects.
Drawings
In order to illustrate the technical solutions of the embodiments of the present application more clearly, the drawings required by the embodiments are briefly described below. It should be understood that the following drawings illustrate only some embodiments of the present application and therefore should not be considered as limiting its scope; for those skilled in the art, other related drawings can be derived from these drawings without inventive effort.
FIG. 1 is a flow chart illustrating a multimedia editing method in a game provided by an embodiment of the present application;
FIG. 2 is a schematic diagram of a multimedia editing interface provided by an embodiment of the present application;
FIG. 3 is another schematic diagram of a multimedia editing interface provided by an embodiment of the present application;
FIG. 4 is a schematic diagram of a multimedia editing interface after determining multimedia elements according to an embodiment of the present application;
FIG. 5 is a schematic diagram of a game screen provided by an embodiment of the present application;
FIG. 6 is a schematic diagram illustrating an NPC dialogue provided by an embodiment of the present application;
FIG. 7 is a further diagram illustrating a multimedia editing interface provided by an embodiment of the present application;
FIG. 8 is a further diagram of a multimedia editing interface provided by an embodiment of the present application;
FIG. 9 is a schematic flow chart illustrating a multimedia editing method in a game according to an embodiment of the present application;
FIG. 10 is a diagram illustrating an element submenu provided by an embodiment of the present application;
FIG. 11 is another schematic diagram of a multimedia editing interface after determining multimedia elements according to an embodiment of the application;
FIG. 12 is a schematic flow chart illustrating a multimedia editing method in a game according to an embodiment of the present application;
FIG. 13 is a diagram illustrating an in-game mobile coordinate information editing interface provided by an embodiment of the present application;
FIG. 14 is a further diagram illustrating a multimedia editing interface provided by an embodiment of the present application;
FIG. 15 is a schematic diagram of a game interface in preview mode provided by an embodiment of the present application;
FIG. 16 is a schematic flow chart illustrating a multimedia editing method in a game according to an embodiment of the present application;
FIG. 17 is a schematic diagram of a game interface in a frame selection mode provided by an embodiment of the present application;
FIG. 18 is a schematic flow chart illustrating a multimedia editing method in a game according to an embodiment of the present application;
FIG. 19 is a schematic diagram of a multimedia data playlist provided by an embodiment of the present application;
FIG. 20 is a schematic structural diagram illustrating an in-game multimedia editing apparatus according to an embodiment of the present application;
FIG. 21 is a schematic structural diagram illustrating an in-game multimedia editing apparatus according to an embodiment of the present application;
FIG. 22 is a schematic structural diagram illustrating a multimedia editing apparatus in a game according to an embodiment of the present application;
FIG. 23 is a schematic structural diagram illustrating a multimedia editing apparatus in a game according to an embodiment of the present application;
FIG. 24 is a schematic structural diagram illustrating a multimedia editing apparatus in a game according to an embodiment of the present application;
FIG. 25 is a schematic structural diagram of a terminal device provided in an embodiment of the present application.
Detailed Description
In order to make the objects, technical solutions and advantages of the embodiments of the present application clearer, the technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application, and it is obvious that the described embodiments are some embodiments of the present application, but not all embodiments. The components of the embodiments of the present application, generally described and illustrated in the figures herein, can be arranged and designed in a wide variety of different configurations.
Thus, the following detailed description of the embodiments of the present application, presented in the accompanying drawings, is not intended to limit the scope of the claimed application, but is merely representative of selected embodiments of the application. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present application.
It should be noted that: like reference numbers and letters refer to like items in the following figures, and thus, once an item is defined in one figure, it need not be further defined and explained in subsequent figures. In the description of the present application, it is noted that the terms "first", "second", "third", and the like are used merely for distinguishing between descriptions and are not intended to indicate or imply relative importance.
The application provides a multimedia editing method in a game, which can be applied to terminal equipment. The terminal device may be a local terminal device or a server, and when the terminal device is a server, in an optional implementation, the game is a cloud game.
In an alternative embodiment, cloud gaming refers to a game mode based on cloud computing. In the running mode of a cloud game, the body that runs the game program is separated from the body that presents the game picture: the storage and execution of the game data processing method are completed on a cloud game server, while the cloud game client receives and sends data and presents the game picture. The cloud game client may be, for example, a display device with a data transmission function close to the user side, such as a mobile terminal, a television, a computer, or a handheld computer; the terminal device that actually performs the game data processing, however, is the cloud game server in the cloud. When playing, the player operates the cloud game client to send operation instructions to the cloud game server; the server runs the game according to the instructions, encodes and compresses data such as the game pictures, and returns them to the client through the network; finally, the client decodes the data and outputs the game picture.
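Purely as an illustration of the request/response cycle just described (the application does not prescribe any particular protocol or API), one cycle might be sketched as follows, with every class and method a stand-in:

```python
class Server:
    def run_game(self, op):  return f"frame after {op}"
    def encode(self, frame): return frame.encode()

class Client:
    def read_input(self):     return "move-left"
    def decode(self, packet): return packet.decode()
    def display(self, frame): print("showing:", frame)

def cloud_game_cycle(client: Client, server: Server) -> None:
    op = client.read_input()           # player operation sent to the server
    picture = server.run_game(op)      # server runs the game per the instruction
    packet = server.encode(picture)    # game picture encoded and compressed
    client.display(client.decode(packet))  # client decodes and outputs

cloud_game_cycle(Client(), Server())
```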
In an alternative embodiment, the terminal device may be a local terminal device. The local terminal device stores the game program and presents the game screen, and it interacts with the player through a graphical user interface; that is, the game program is conventionally downloaded, installed, and run on an electronic device. The local terminal device may provide the graphical user interface to the player in various ways: for example, the interface may be rendered and displayed on a display screen of the terminal, or it may be provided to the player by projection (two-dimensional planar or three-dimensional stereoscopic) through an output device of the local terminal device, such as a projection device. For example, the local terminal device may include a display screen for presenting a graphical user interface including a game screen, and a processor for running the game, generating the graphical user interface, and controlling its display on the display screen. Alternatively, the local terminal device may omit the display screen and instead display the graphical user interface by projection in real space, in two-dimensional planar or three-dimensional stereoscopic form, receiving the player's operations on the graphical user interface through a sensing device provided by the terminal device or a third-party device, thereby implementing interaction between the terminal device and the player.
The multimedia editing method in the game provided by the embodiment of the application is applied to the terminal equipment, so that the operation difficulty of a game player in editing the cut scene animation can be reduced.
Fig. 1 shows a flow chart of a multimedia editing method in a game provided by an embodiment of the present application.
As shown in fig. 1, the multimedia editing method in the game may include:
s101, providing a multimedia editing interface on the graphical user interface.
As described above, the graphical user interface may be a game interface, for example, after the terminal device loads a game, the game interface of the game may be provided to the user, where the terminal device may be the aforementioned local terminal device, or the aforementioned cloud game client. A multimedia editing interface may be provided in the graphical user interface.
S102, controlling the multimedia editing interface to display at least one element selection control.
Wherein different element selection controls are associated with different time frames.
Optionally, at least one element selection control may be displayed in the multimedia editing interface, and each element selection control may be associated with a different time frame. For example, if there are element selection controls 1, 2, and 3, and the time frames are a plurality of frames arranged in time order during the multimedia presentation, then element selection control 1 may be associated with the first frame, element selection control 2 with the second frame, and element selection control 3 with the third frame. Of course, one time frame may also be associated with multiple element selection controls; this is not specifically limited here.
Fig. 2 is a schematic diagram of a multimedia editing interface provided in an embodiment of the present application. As shown in fig. 2, at least one element selection control 210 may be displayed in the multimedia editing interface, and when there are multiple element selection controls 210, they may be associated with different time frames. For example, the time frames corresponding to the element selection controls 210 shown in fig. 2 may be ordered sequentially along the horizontal direction; that is, the first element selection control 210 may be associated with the first frame, the second element selection control 210 with the second frame, and so on, so that the multimedia element added through the first element selection control 210 is used to generate the multimedia data played in the first frame.
S103, in response to the selection operation on the element selection control, determining the multimedia element corresponding to the element selection control.
Optionally, when a user (e.g., a game player) performs a selection operation on the element selection control, the multimedia editing interface may present the user with an identifier of a selectable multimedia element corresponding to the element selection control.
Fig. 3 is another schematic diagram of a multimedia editing interface provided in an embodiment of the present application. As shown in fig. 3, when the player selects the third element selection control in the first row (counting from left to right), the multimedia editing interface may present the multimedia elements corresponding to that control, such as: exit, move, speak, expression, and the like.
Alternatively, the selection operation on the element selection control may be a touch click operation, a sliding operation, a long-press operation, and the like on the touch screen, which is not limited herein.
The player can add the multimedia elements to the time frame corresponding to each element selection control in the multimedia editing interface by selecting the multimedia elements corresponding to the element selection controls presented in the multimedia editing interface. For example, fig. 4 shows a schematic view of a multimedia editing interface after determining multimedia elements provided in an embodiment of the present application, and after a user adds multimedia elements in the multimedia editing interface, the multimedia editing interface may be as shown in fig. 4.
Optionally, in this embodiment of the application, each multimedia element has a corresponding time frame, and the time frames of different multimedia elements may be the same or different. After the multimedia element corresponding to an element selection control is determined in response to the selection operation, the frame length of that multimedia element can be obtained. For different multimedia elements, the frame length may be a pre-configured fixed length, for example one martial-arts action occupying one frame; or it may be calculated from the editing parameters the player selects, for example, if the player selects and edits a line of dialogue for a virtual character, the frame length may be calculated from the dialogue length and a preset speech rate. When multimedia elements are added to the time frames corresponding to the element selection controls in the multimedia editing interface, they automatically and adaptively occupy one or more time frames according to their frame lengths.
For example, among the multimedia elements shown in fig. 4, if the frame length of "enter" is 1 frame and the frame length of "move" is 2 frames, then "enter" is automatically fitted into the first frame, "move" into the second and third frames, and so on.
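A small sketch of the adaptive frame occupancy described in the two preceding paragraphs; the dialogue-length rule follows the example given above, while the speech rate value and fixed frame lengths are assumed for illustration.

```python
import math

PRESET_SPEECH_RATE = 4                          # characters per frame (assumed)
FIXED_FRAME_LENGTHS = {"enter": 1, "move": 2}   # pre-configured lengths (assumed)

def frame_length(element: str, dialogue: str = "") -> int:
    if element == "speak":
        # Calculated from the dialogue length and a preset speech rate.
        return max(1, math.ceil(len(dialogue) / PRESET_SPEECH_RATE))
    return FIXED_FRAME_LENGTHS.get(element, 1)

def add_elements(elements):
    # Lay elements onto the timeline, each occupying its frame length.
    timeline, frame = {}, 0
    for name, dialogue in elements:
        length = frame_length(name, dialogue)
        for f in range(frame, frame + length):
            timeline[f] = name
        frame += length
    return timeline

print(add_elements([("enter", ""), ("move", ""), ("speak", "Hello, everyone")]))
```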
S104, generating corresponding multimedia data according to the multimedia elements and the time frames corresponding to the multimedia elements.
After the steps S101, S102, and S103 are completed, corresponding multimedia data may be generated according to multimedia elements added in the multimedia editing interface and time frames corresponding to the multimedia elements.
Taking the foregoing fig. 4 as an example, the generated multimedia data is: enter - move - speak ("Hello, everyone") - exit. It may be performed by a default virtual character in the generated multimedia data, or by a virtual character the user selects and adds; this is not limited here.
In this way, the embodiment of the application provides a multimedia editing interface on a graphical user interface and controls it to display at least one element selection control, where different element selection controls are associated with different time frames. The multimedia element corresponding to an element selection control is determined in response to a selection operation on it, and corresponding multimedia data is generated from the multimedia elements and their associated time frames. Multimedia elements can thus be added to controls corresponding to different time frames through simple selection operations, and multimedia data generated from them, which greatly reduces the difficulty of multimedia editing in a game and makes multimedia editing easy to realize on a touch terminal.
Optionally, the providing a multimedia editing interface on the graphical user interface may include:
and responding to the editing instruction, and displaying the multimedia editing interface through the graphical user interface.
The editing instruction may be a click operation on a specific position, a specific virtual object, or a specific control in the graphical user interface, or an operation in which the user controls a specific virtual object in the graphical user interface to move to a specific position that can trigger the multimedia editing interface.
In one embodiment, the graphical user interface may include a game screen including the first virtual object, and the responding to the editing instruction may include:
in response to a triggering operation on the first virtual object.
For example, the first virtual object may be a specific area in the game, and the triggering operation may be the player's virtual character moving into certain specific areas of the game map, which triggers the multimedia editing interface. Alternatively, the first virtual object may be a virtual control provided for the player in the game interface, and the triggering operation may be the player clicking that virtual control to open the multimedia editing interface. Still alternatively, the first virtual object may be a Non-Player Character (NPC) in the game, and the multimedia editing interface may be opened through an NPC dialogue; that is, the triggering operation may be the player controlling the virtual character to move within a preset distance of the NPC, which opens the multimedia editing interface, or the player controls the virtual character to move within a preset distance of the NPC, an NPC dialogue is displayed on the game interface, and the multimedia editing interface is entered by selecting a control contained in the NPC dialogue.
Fig. 5 shows a schematic diagram of a game screen provided in an embodiment of the present application.
As shown in fig. 5, in the game screen, the character 510 may be an NPC character in the game, and the character 520 may be a virtual character controlled by the player. The player may trigger an NPC dialogue with the NPC 510 by controlling the character 520 to move within a preset distance of the NPC 510.
Fig. 6 shows a schematic diagram of an NPC dialogue provided in an embodiment of the present application. As shown in fig. 6, the NPC dialogue may offer the following options: 1) start staging a play; 2) how to stage a play; 3) bulletin. Here, 1), 2), and 3) are selectable controls. When the player clicks "start staging a play", the multimedia editing interface is entered; when the player clicks "how to stage a play", instructions for using the multimedia editing interface are displayed; when the player clicks "bulletin", the NPC dialogue is exited and the original game interface is restored.
Fig. 7 shows another schematic diagram of a multimedia editing interface provided in an embodiment of the present application.
Optionally, as shown in FIG. 7, the multimedia editing interface may include a project selection control 710. The controlling the multimedia editing interface to display at least one element selection control may include:
and responding to the selection operation of the project selection control, controlling the multimedia editing interface to display the object to be edited and at least one element selection control corresponding to the object to be edited.
Fig. 8 shows another schematic diagram of a multimedia editing interface provided in an embodiment of the present application.
As shown in fig. 8, the object to be edited may be a virtual character, a scene, a sound effect, music, or anything else that can be presented in the multimedia data. For example, it may be a virtual character in a cut-scene, and each virtual character may have a unique identification (ID) bound to it. After the player selects the project selection control, a plurality of selectable objects to be edited are displayed; if a virtual character is selected, the multimedia editing interface may display that virtual character (i.e., the object to be edited) and at least one element selection control corresponding to it.
In response to a selection operation on an element selection control corresponding to the virtual character, the multimedia element corresponding to the virtual character is determined. That is, if the multimedia elements of the play are actions such as entering, moving, expression, and speech, those actions are executed by the virtual character.
Optionally, the controlling the multimedia editing interface to display at least one element selection control may include:
and controlling the horizontal axis direction of the multimedia editing interface to display a plurality of time frames which are arranged in a time sequence, and displaying at least one corresponding element selection control in the vertical axis direction of each time frame.
Referring to fig. 8, for example, a time axis 810 may be displayed on the multimedia editing interface, with a plurality of time frames arranged in time order along the horizontal axis, such as a first frame, a second frame, ..., a ninth frame, and so on, and at least one element selection control displayed in the vertical axis direction corresponding to each frame.
Fig. 9 shows another flow chart of a multimedia editing method in a game provided by the embodiment of the application.
Optionally, as shown in fig. 9, the determining, in response to the selection operation on the element selection control, a multimedia element corresponding to the element selection control may include:
and S901, responding to the selection operation of the element selection control, and displaying an element submenu corresponding to the element selection control.
The element submenu includes: at least one element selection item.
As described above, after the player performs a selection operation on the element selection control, an element submenu corresponding to the element selection control may be presented to the player.
Taking a game interface as an example, fig. 10 shows a schematic diagram of an element submenu provided in an embodiment of the present application. If the user clicks the third element selection control in the first row (counting from left to right), the corresponding element submenu may be as shown in fig. 10, and may include multiple multimedia elements such as exit, move, speak, and expression.
In the embodiment of the application, the element submenus corresponding to different element selection controls may be the same or different. That is, after the player selects different element selection controls, the viewed element selection items in the element submenu may be the same or different, and the application is not limited herein.
S902, in response to the selection operation on the element submenu, acquiring the multimedia element corresponding to the element selection control.
Optionally, the user may further select each multimedia element in the element submenu.
Taking the element submenu shown in fig. 10 as an example, the user can select multimedia elements such as exit, move, speak, and expression to be added to the time frame (e.g., the third frame) corresponding to the third element selection control.
Fig. 11 is another schematic diagram of the multimedia editing interface after determining multimedia elements according to the embodiment of the present application, and after a user adds multimedia elements in the multimedia editing interface in the manner described above, the multimedia editing interface may be as shown in fig. 11.
In fig. 11, the play actions composed of the multimedia elements corresponding to the first play character are as follows: first and second frames: enter; third and fourth frames: pounding; fifth frame: blank; sixth and seventh frames: exit.
It should be noted that mutual exclusion relationships may be preconfigured between multimedia elements, to avoid conflicts when the multimedia data is generated. Optionally, multimedia elements that have a mutual exclusion relationship cannot be added to the same time frame of the same object to be edited. For example, "enter" is mutually exclusive with all other multimedia elements, so after "enter" is added to the first frame of a virtual character, no other multimedia element can be added to that character's first frame. For another example, if "pounding" and "move" are mutually exclusive, then after "pounding" is added to the third and fourth frames, "move" cannot be selected when operating the element selection controls corresponding to those frames. The specific implementation form is not limited here.
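The mutual exclusion check can be sketched as a lookup table keyed by element name; the pairs below mirror the examples in this paragraph, and the table structure itself is an assumption, not the application's implementation.

```python
# "enter" excludes everything else; "pounding" and "move" exclude each other.
MUTEX = {
    "enter": {"*"},          # mutually exclusive with all other elements
    "pounding": {"move"},
    "move": {"pounding"},
}

def can_add(existing: set, candidate: str) -> bool:
    # Reject the candidate if it conflicts with any element already placed
    # in the same time frame of the same object to be edited.
    for placed in existing:
        banned = MUTEX.get(placed, set())
        if "*" in banned or candidate in banned:
            return False
    if existing and "*" in MUTEX.get(candidate, set()):
        return False
    return True

print(can_add({"pounding"}, "move"))   # False: mutually exclusive pair
print(can_add(set(), "enter"))         # True: the frame is still empty
print(can_add({"enter"}, "speak"))     # False: "enter" excludes all others
```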
Fig. 12 is a schematic flowchart illustrating a multimedia editing method in a game according to an embodiment of the present application.
Optionally, as shown in fig. 12, the obtaining of the multimedia element corresponding to each element selection control in response to the selection operation of the element submenu may include:
and S1201, responding to the selection operation of the element submenu, and displaying a corresponding element editing interface.
The element submenu may include one or more of the following: entering, transition, exiting, moving, speaking, and expression. When the element submenu contains any one of entering, transition, exiting, and moving, the element editing interface may include a movement coordinate information editing interface.
The movement coordinate information editing interface may be used to edit the movement direction, position information, and the like of the first virtual object. Coordinate points of the movement coordinate information editing interface may correspond to coordinate points at the corresponding positions in a multimedia presentation interface in the game interface. For example, the multimedia presentation interface may be the stage of the play animation, or a specific area of the game interface.
Optionally, the editing parameters of the element editing interface may include a selection of the end position of the virtual character. That is, the current position of the virtual character is known: initially it is the starting position, and thereafter it is the end position of the previously completed action. The player therefore only edits the end position, to control the virtual character to move to the edited end position.
For example, the element editing interface may include coordinate axes through which the player can see the corresponding position in the multimedia presentation interface; the player can then set the end position of the virtual character by dragging or clicking a position point on the coordinate axes.
Fig. 13 is a schematic diagram illustrating a mobile coordinate information editing interface in a game provided in an embodiment of the present application.
Optionally, as shown in fig. 13, the element editing interface may include m × n virtual selection grids, where each grid corresponds to a coordinate point at a corresponding position in the multimedia presentation interface in the game interface, and m and n are integers greater than 0.
The obtaining, according to the editing parameters of the element editing interface, of the multimedia element corresponding to each element selection control may include: in response to a selection operation on a target grid in the element editing interface, acquiring the end position information of the multimedia element corresponding to the element selection control.
For example, if the entry position of the character in the game is "A" (the first virtual object) shown in fig. 13, and the player slides on the touch interface from the grid containing character "A" to the target grid in the fourth row and fourth column, in the direction of the arrow shown in fig. 13, then in the editing parameters of the element editing interface the end position of the virtual character is the position in the multimedia presentation interface corresponding to that target grid. Optionally, the player may instead click on the target grid in the fourth row and fourth column to indicate the same end position.
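Here is a sketch of the m x n grid mapping, assuming the multimedia presentation interface spans a known rectangle in scene coordinates; the bounds, indices, and function name are illustrative only.

```python
def grid_to_scene(row: int, col: int, m: int, n: int,
                  origin=(0.0, 0.0), size=(10.0, 10.0)):
    # Map a selected target grid cell (row, col) of an m x n virtual
    # selection grid to the coordinate point at the corresponding
    # position in the multimedia presentation interface.
    width, height = size
    x = origin[0] + (col + 0.5) * width / n    # cell centre, x axis
    y = origin[1] + (row + 0.5) * height / m   # cell centre, y axis
    return (x, y)

# Player selects the target grid in the fourth row, fourth column
# (zero-based indices 3, 3) of an assumed 6 x 6 grid:
end_position = grid_to_scene(3, 3, m=6, n=6)
print(end_position)   # end position info for the multimedia element
```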
S1202, acquiring, according to the editing parameters of the element editing interface, the multimedia element corresponding to the element selection control.
The multimedia element here is a multimedia element that includes the above editing parameter.
Optionally, in other embodiments, the element editing interface may also be a coordinate system, a grid coordinate interface, and the like, and is not limited to the m × n virtual selection grids, and the application is not limited herein.
Optionally, the element editing interface may further display the position information of other objects to be edited in the same frame, for example by showing their identifiers (avatar, name, and the like) at the corresponding positions, so that while editing, the player does not place the current virtual character in a position that conflicts with other objects to be edited.
Optionally, the object to be edited may include: base material and/or virtual characters. The controlling the multimedia editing interface to display at least one element selection control may include:
the method comprises the steps of controlling a longitudinal axis direction of a multimedia editing interface to display an object to be edited, displaying a plurality of time frames arranged according to a time sequence in a transverse axis direction, and displaying at least one element selection control corresponding to each object to be edited in the longitudinal axis direction of each time frame so as to associate the element selection controls with different time frames.
Taking a game interface as an example, the base material may be a scene, a sound effect, and the like, and the virtual character is a character capable of giving a multimedia performance, such as a virtual person or animal in the game. Correspondingly, the element selection control corresponding to the base material may include one or more of the following: a scene selection control, a background selection control, a music selection control, and a sound effect selection control.
When the element selection control is a scene selection control, the corresponding element submenu may include different scenes. When an element selection control is a background selection control, the corresponding element submenu may include different game backgrounds. When the element selection control is a music selection control, the corresponding element submenu may be a music list, and the music list may include a plurality of pieces of music. When the element selection control is a sound effect selection control, the corresponding element submenu may include a plurality of different sound effects, and the like.
Alternatively, in some embodiments, all element selection controls corresponding to the base material may be identical, and each element selection control may include elements such as scene, background, music, and sound effect. When a user selects a certain element in such a control, a plurality of different options of the selected element type can be presented. Taking a scene as an example, if the user selects the scene element, a scene list may be presented, which may include a plurality of selectable scenes; the user can then select one of the scenes as the scene element to be added to the time frame. Background, music, sound effects, and the like are handled similarly and are not described again here.
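The control-to-submenu relationship described above can be sketched as a simple lookup from element type to its selectable options; every option name below is a placeholder rather than an asset defined by this application.

```python
# Sketch of an element selection control whose submenu lists the
# selectable options of one element type. All names are placeholders.
ELEMENT_SUBMENUS = {
    "scene":      ["scene_1", "scene_2", "scene_3"],
    "background": ["bg_day", "bg_night"],
    "music":      ["track_1", "track_2"],
    "sound":      ["applause", "thunder"],
}

def choose_element(element_type: str, index: int) -> str:
    """Return the option picked from the submenu of the given element
    type; the pick becomes the multimedia element added to the
    currently selected time frame."""
    return ELEMENT_SUBMENUS[element_type][index]

# e.g. the user opens the scene submenu and picks the second scene:
print(choose_element("scene", 1))  # scene_2
```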
Fig. 14 shows another schematic diagram of a multimedia editing interface provided in an embodiment of the present application.
As shown in fig. 14, on the horizontal axis corresponding to the scene and sound effects (i.e., the base material), elements such as music, scene, sound effect, and background may be added, through the scene selection control, the background selection control, the music selection control, and the sound effect selection control, in a plurality of time frames arranged in time sequence. For example, music and a scene may be added at the first frame, a transition may be added at the second frame, and so on.
In the embodiment of the present application, if a scene or music is added at a certain frame, that scene or music may be kept in the following time frames. When the scene or music needs to be switched at a certain frame, the scene selection control or the music selection control can be operated again in that time frame to select and add the required scene, music, or other element.
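A minimal sketch of this "keep until switched" rule, resolving which scene or music is in effect at a given frame; the frame numbers and element names are made up for illustration.

```python
# An element added at frame t stays in effect for all later frames
# until an element of the same type is added again.
def effective_element(timeline: dict, element_type: str, frame: int):
    """timeline maps frame -> {element_type: element}. Return the
    element of `element_type` in effect at `frame`, or None."""
    current = None
    for f in sorted(timeline):
        if f > frame:
            break
        current = timeline[f].get(element_type, current)
    return current

timeline = {1: {"scene": "courtyard", "music": "theme_a"},
            4: {"music": "theme_b"}}        # music switched at frame 4
print(effective_element(timeline, "music", 3))  # theme_a
print(effective_element(timeline, "music", 5))  # theme_b
```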
Optionally, after generating the corresponding multimedia data according to the multimedia element and the time frame corresponding to the multimedia element, the method may further include:
and responding to the selection operation of the preview mode selection control, and playing the multimedia data, wherein the playing interface of the multimedia data contains time frame information.
The preview mode selection control may be set in the same manner as the first virtual object described in the foregoing embodiment, for example, as a virtual control presented in the graphical user interface or in the multimedia editing interface. In the preview mode, the user can preview the generated multimedia data.
Fig. 15 is a schematic diagram illustrating a game interface in a preview mode according to an embodiment of the present application.
As shown in fig. 15, in the preview mode the time axis may occupy only a small strip at the bottom or top of the screen, displaying the time frame information of the multimedia data, while most of the screen is used for previewing the drama (i.e., the multimedia data).
Fig. 16 is a schematic flow chart illustrating a multimedia editing method in a game provided by an embodiment of the present application.
Optionally, as shown in fig. 16, after generating the corresponding multimedia data according to the multimedia element and the time frame corresponding to the multimedia element, the method may further include:
S1601, in response to a selection operation on the frame selection mode selection control, playing the multimedia data.
The playing interface of the multimedia data comprises a time frame selection virtual control.
S1602, in response to a dragging operation on the time frame selection virtual control, jumping the multimedia data to the corresponding time frame for playing.
The frame selection mode selection control can be set with reference to the setting of the preview mode selection control, which is not described again here. In the frame selection mode, the user can also preview the generated multimedia data and, at the same time, drag the time axis to a specified time to observe the multimedia content at that time.
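The drag operation reduces to mapping a horizontal touch position on the time axis to a frame index and seeking there; the axis width, frame count, and the commented-out `seek` call below are assumptions for illustration.

```python
# Sketch of drag-to-seek in the frame selection mode: a horizontal
# drag position on the time axis is converted into a time frame, and
# playback jumps there.
def drag_to_frame(drag_x: float, axis_width: float, total_frames: int) -> int:
    """Map a drag position on the time axis to a frame index,
    clamped to the valid range."""
    ratio = min(max(drag_x / axis_width, 0.0), 1.0)
    return round(ratio * (total_frames - 1))

frame = drag_to_frame(drag_x=540.0, axis_width=1080.0, total_frames=120)
print(frame)  # 60 (halfway along the axis)
# player.seek(frame)  # hypothetical playback call
```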
Fig. 17 is a schematic diagram illustrating a game interface in a frame selection mode according to an embodiment of the present application.
As shown in fig. 17, in the frame selection mode the time axis may occupy the bottom or top of the screen, while the rest of the screen displays the drama (the multimedia data).
Optionally, in some embodiments, editing of the time axis may also be supported in the frame selection mode, for example, in a manner described in the foregoing embodiments, the element selection control may be operated at the bottom of the screen, or the order of the multimedia elements on the time axis may be adjusted.
Fig. 18 is a schematic flowchart illustrating a multimedia editing method in a game according to an embodiment of the present application. In this embodiment, the player may also choose to view multimedia data edited by other players.
Optionally, as shown in fig. 18, the method may further include:
S1801, in response to a multimedia viewing trigger operation, displaying a multimedia data playlist.
The multimedia data playlist includes: the identification of at least one multimedia data and the corresponding playing virtual control.
The identification of the multimedia data may be a name (e.g., the name of the drama), a number, or the like. The playing virtual control may be a virtual control displayed in the multimedia data playlist.
Fig. 19 is a schematic diagram illustrating a multimedia data playlist provided in an embodiment of the present application.
As shown in fig. 19, the identifiers of the multimedia data may include: "Yugong Shi shan", "Bai she Chuan", and so on. A playing virtual control is arranged after the identifier of each piece of multimedia data, and the user can play the corresponding multimedia data by clicking the playing virtual control.
Optionally, the multimedia data playlist may further include a ranking of the multimedia data, an author (director) of the multimedia data, a playing time of the multimedia data, and the like, which is not limited herein.
S1802, in response to a selection operation on the playing virtual control of the target multimedia data in the multimedia data playlist, playing the target multimedia data.
Optionally, the multimedia data playlist may further include a supporting virtual control corresponding to the identifier of each piece of multimedia data. The method may further comprise: in response to a selection operation on the supporting virtual control of the target multimedia data in the multimedia data playlist, updating the support data of the target multimedia data.
The supporting virtual control corresponding to the identifier of each piece of multimedia data may be a praise (like) virtual control, a reward virtual control, a comment virtual control, or the like. For example, as shown in fig. 19, a reward virtual control may be provided after the identifier of each piece of multimedia data, and the user can reward a favorite piece of multimedia data (drama) by clicking the reward virtual control. When the supporting virtual control is a praise virtual control or a comment virtual control, the user may likewise praise favorite multimedia data or comment on a certain piece of multimedia data through that control.
After the user selects the supporting virtual control corresponding to a certain piece of multimedia data in the multimedia data playlist, the support data of that target multimedia data can be updated in response to the selection operation. Further, the multimedia data may be ranked according to the support data to obtain ranking information, and so on.
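A minimal sketch of this support-and-rank flow, reusing the drama titles shown in fig. 19; the starting counts and function names are invented for illustration.

```python
# Sketch of updating support data when the supporting virtual control
# is tapped, and ranking the playlist by support data.
supports = {"Yugong Shi shan": 41, "Bai she Chuan": 73}

def on_support_clicked(title: str) -> None:
    """Called when the player taps the supporting virtual control of
    the target multimedia data; updates its support data."""
    supports[title] = supports.get(title, 0) + 1

def leaderboard(top_n: int = 15):
    """Rank the multimedia data by support data, highest first."""
    return sorted(supports.items(), key=lambda kv: kv[1], reverse=True)[:top_n]

on_support_clicked("Yugong Shi shan")
print(leaderboard())  # [('Bai she Chuan', 73), ('Yugong Shi shan', 42)]
```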
Based on the multimedia editing method in the game described in the foregoing embodiment, the embodiment of the present application further provides a multimedia editing apparatus in the game, which provides a graphical user interface through a terminal device.
Fig. 20 is a schematic structural diagram illustrating an in-game multimedia editing apparatus according to an embodiment of the present application.
As shown in fig. 20, the in-game multimedia editing apparatus may include: the interface providing module 11 is used for providing a multimedia editing interface on the graphical user interface; the control display module 12 is configured to control the multimedia editing interface to display at least one element selection control, where different element selection controls are associated with different time frames; the response module 13 is configured to determine, in response to a selection operation on the element selection control, a multimedia element corresponding to the element selection control; a generating module 14, configured to generate corresponding multimedia data according to the multimedia element and the time frame corresponding to the multimedia element.
Optionally, the interface providing module is specifically configured to, in response to the editing instruction, display a multimedia editing interface through the graphical user interface.
Optionally, the graphical user interface includes a game screen, the game screen includes a first virtual object, and the interface providing module is specifically configured to display the multimedia editing interface in response to a trigger operation on the first virtual object.
Optionally, the multimedia editing interface includes a project selection control, and the control display module is specifically configured to, in response to a selection operation on the project selection control, control the multimedia editing interface to display an object to be edited and at least one element selection control corresponding to the object to be edited.
Optionally, the control display module is specifically configured to control a horizontal axis direction of the multimedia editing interface to display a plurality of time frames arranged in a time sequence, and a vertical axis direction of each time frame displays at least one corresponding element selection control.
Fig. 21 is a schematic diagram illustrating another structure of an in-game multimedia editing apparatus according to an embodiment of the present application.
Alternatively, as shown in fig. 21, the response module may include: the first response sub-module 131 is configured to, in response to a selection operation on an element selection control, display an element sub-menu corresponding to the element selection control, where the element sub-menu includes: at least one element selection item; the second response submodule 132 is configured to, in response to a selection operation on the element submenu, obtain a multimedia element corresponding to each element selection control.
Optionally, the second response submodule is specifically configured to display the corresponding element editing interface in response to a selection operation on the element submenu, and to acquire, in response to the editing parameters of the element editing interface, the multimedia elements corresponding to the element selection controls.
Optionally, the element submenu includes one or more of: entering, transferring, leaving, moving, speaking, and expression; when the element submenu contains any one of entering, transferring, leaving, and moving, the element editing interface includes a movement coordinate information editing interface.
Optionally, a coordinate point of the movement coordinate information editing interface corresponds to a coordinate point at the corresponding position in the multimedia presentation interface of the game interface; the editing parameters of the element editing interface include the selected end position of the virtual character.
Optionally, the element editing interface includes m × n virtual selection grids, each grid corresponds to a coordinate point at a corresponding position in the multimedia presentation interface in the game interface, where m and n are integers greater than 0; the second response submodule is specifically configured to respond to a selection operation of a target grid in the element editing interface, and acquire end point position information of a multimedia element corresponding to the element selection control.
Optionally, the object to be edited includes: base materials and/or virtual characters; the element selection control corresponding to the base material includes one or more of the following: a scene selection control, a background selection control, a music selection control, and a sound effect selection control.
Optionally, the control display module is specifically configured to control a longitudinal axis direction of the multimedia editing interface to display the object to be edited, a transverse axis direction of the multimedia editing interface displays a plurality of time frames arranged in a time sequence, and a longitudinal axis direction of each time frame displays at least one element selection control corresponding to each object to be edited, so as to associate the element selection controls with different time frames.
Fig. 22 is a schematic structural diagram illustrating a multimedia editing apparatus in a game according to an embodiment of the present application.
Optionally, as shown in fig. 22, the in-game multimedia editing apparatus may further include: and the preview playing module 15 is configured to play the multimedia data in response to a selection operation of the preview mode selection control, where a playing interface of the multimedia data includes time frame information.
Fig. 23 is a schematic structural diagram illustrating a multimedia editing apparatus in a game according to an embodiment of the present application.
Optionally, as shown in fig. 23, the in-game multimedia editing apparatus may further include: a frame selection playing module 16, configured to play the multimedia data in response to a selection operation on the frame selection mode selection control, where the playing interface of the multimedia data includes a time frame selection virtual control; and to jump the multimedia data to the corresponding time frame for playing in response to a dragging operation on the time frame selection virtual control.
Fig. 24 is a schematic structural diagram illustrating a multimedia editing apparatus in a game according to an embodiment of the present application.
Optionally, as shown in fig. 24, the in-game multimedia editing apparatus may further include: a multimedia viewing module 17, configured to display a multimedia data playlist in response to a multimedia viewing trigger operation, where the multimedia data playlist includes: the identifier of at least one piece of multimedia data and the corresponding playing virtual control; and further configured to play the target multimedia data in response to a selection operation on the playing virtual control of the target multimedia data in the multimedia data playlist.
Optionally, the multimedia data playlist further includes: the support virtual control corresponding to the identification of each multimedia data; the multimedia viewing module is also used for responding to the selection operation of the supporting virtual control of the target multimedia data in the multimedia data play list and updating the supporting data of the target multimedia data.
It can be clearly understood by those skilled in the art that, for convenience and brevity of description, the specific working process of the apparatus described above may refer to the corresponding process of the method in the foregoing method embodiment, and is not described in detail in this application.
In the embodiments provided in the present application, it should be understood that the disclosed method and apparatus can be implemented in other ways. The above-described apparatus embodiments are merely illustrative, and for example, the division of the modules is merely a logical division, and there may be other divisions in actual implementation, and for example, a plurality of modules or components may be combined or integrated into another system, or some features may be omitted, or not executed. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection of devices or modules through some communication interfaces, and may be in an electrical, mechanical or other form.
The modules described as separate parts may or may not be physically separate, and parts displayed as modules may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, functional modules in the embodiments of the present application may be integrated into one processing unit, or each module may exist alone physically, or two or more modules are integrated into one module.
The functions, if implemented in the form of software functional units and sold or used as a stand-alone product, may be stored in a non-volatile computer-readable storage medium executable by a processor. Based on such understanding, the technical solution of the present application or portions thereof that substantially contribute to the prior art may be embodied in the form of a software product stored in a storage medium and including instructions for enabling a terminal device to perform all or part of the steps of the method according to the embodiments of the present application.
On the basis of the foregoing embodiment, an embodiment of the present application further provides a game data processing method, where a terminal device provides a graphical user interface, where the graphical user interface includes a game screen, and the game screen includes at least a part of a game scene and a virtual object. The method comprises the following steps:
a. A specific scene area in the game scene is determined.
The specific scene area may refer to an area in the game scene where the multimedia data can be displayed; for example, after a "table", a "martial arts stage", a "banquet hall", or the like appears in a certain game scene, the multimedia data can be displayed at these positions.
b. Multimedia data is acquired, wherein the multimedia data comprises virtual object information and performance parameters.
c. Rendering a virtual character corresponding to the virtual object information in the specific scene area according to the performance parameters, and controlling the virtual character to perform.
The virtual character may be a character in the game or a specific character in a specific performance scene, which is not limited here. It should be noted that the performance of the virtual character in this embodiment takes place in the game scene, and the virtual character is rendered in the same game world in which the player controls the player virtual character, rather than being a pre-made CG animation as in the prior art or an animation played during game scene switching. The multimedia data may be data edited by the player or by other players using the aforementioned multimedia editing method, and the performance parameters may be the editing parameters input during the editing process; for the specific editing process, reference may be made to the foregoing embodiments, which is not limited here.
In this embodiment, during the game, the multimedia data produced by players can be played in the specific scene area, and the virtual character is controlled to perform in the game scene according to the multimedia data, for example, to perform a specific action or conduct a specific conversation.
Optionally, the multimedia data includes at least two pieces of multimedia sub-data, where the multimedia sub-data include data edited and uploaded by players.
The multimedia subdata can be edited by the same player or different players through the method and the interface provided by the method embodiment.
In this embodiment, the multimedia data may be a preset number of pieces of multimedia sub-data selected according to a certain attribute, which is not limited here; for example, the top 15 pieces of multimedia data on the leaderboard may be acquired with reference to the support data in the foregoing embodiment.
Further, the virtual characters corresponding to the virtual object information are rendered in the specific scene area according to the performance parameters of the at least two pieces of multimedia sub-data, in a priority order corresponding to a preset condition, and each virtual character is controlled to perform. That is, the pieces of multimedia sub-data are played one after another in the specific scene area of the game interface in a certain order; for example, the top 15 pieces of multimedia data on the leaderboard are played in the order of their ranking.
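A sketch of this priority playback: the sub-data are sorted by support data and each performance is rendered in turn. The dictionary layout and the `render_performance` stub are assumptions standing in for the engine's actual rendering call.

```python
# Sketch of playing the top-ranked pieces of multimedia sub-data one
# after another in the specific scene area.
def render_performance(virtual_object: str, params: dict) -> None:
    # Stand-in for the engine call that renders the virtual character
    # in the specific scene area and applies one performance parameter.
    print(f"{virtual_object} performs {params}")

def play_in_scene_area(subdata_list: list, top_n: int = 15) -> None:
    """Play sub-data in priority order (here: descending support)."""
    ranked = sorted(subdata_list, key=lambda d: d["support"], reverse=True)
    for piece in ranked[:top_n]:
        for params in piece["performance_parameters"]:
            render_performance(piece["virtual_object"], params)

play_in_scene_area([
    {"virtual_object": "A", "support": 5,
     "performance_parameters": [{"action": "bow"}]},
    {"virtual_object": "B", "support": 9,
     "performance_parameters": [{"action": "enter"}, {"action": "speak"}]},
])
```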
Another embodiment of the present application further provides a game data processing method, where a terminal device provides a graphical user interface, the graphical user interface includes a game screen, and the game screen includes at least part of a game scene and a virtual object. The method includes:
a. determining a specific scene area in the game scene, wherein the specific scene area contains the virtual object performing according to performance parameters.
The specific scene area may refer to an area in the game world where the multimedia data can be presented; for example, after a "table", a "martial arts stage", a "banquet hall", or the like appears in a certain game scene, the multimedia data can be presented at these positions, for example, one or more virtual characters performing in the specific scene area. It should be noted that the performance of the virtual character in this embodiment takes place in the game scene, and the virtual character is rendered in the same game world in which the player controls the player virtual character, rather than being a pre-made CG animation as in the prior art or an animation played during game scene switching.
b. In response to the current position information of the player virtual character, controlling the viewing angle of the specific scene area to be switched to the angle corresponding to the current position information.
In this embodiment, a player can view multimedia data in the game scene, and the display angle of the multimedia data in the specific scene area can be adaptively adjusted according to the standing position of the player virtual character, so that the player can view the multimedia data better, improving the user experience.
It should be noted that the multimedia data may be multimedia data edited in advance, by the player or by other players, using the aforementioned multimedia editing method; the performance parameters may be the editing parameters input during the editing process. For the specific editing process, reference may be made to the foregoing embodiments, which is not limited here.
For example, after the player virtual character arrives near a table in the game scene, the multimedia data is displayed on the table, and the orientation of the performance on the table is adaptively adjusted according to the position of the player virtual character relative to the table, so that the performance on the table is biased toward the viewing angle of the player virtual character. For instance, if the player virtual character stands to the front left of the table, the performance is deflected to the left.
Optionally, the controlling, in response to the current position information of the player virtual character, the viewing angle of the specific scene area to be switched to the angle corresponding to the current position information includes: in response to the player virtual character moving to the position of a target position selection control, controlling the viewing angle of the specific scene area to be switched to the angle corresponding to the target position selection control.
A plurality of position selection controls can be arranged within a preset range of the specific scene area, so that by arriving at, or selecting, a position selection control, the player can determine more clearly the standing position from which the player virtual character watches the multimedia data; the viewing angle of the specific scene area is then switched to the angle corresponding to the target position selection control.
Alternatively, the position selection control may be a specific virtual article, such as a virtual chair or a virtual stool; after the player virtual character arrives at the position of the virtual article (for example, walks to a certain virtual chair and sits down), the coordinate position of the virtual article is the current position information. The position selection control may also be a virtual frame, a virtual circle, or the like displayed in the game scene; when the player virtual character arrives inside a certain virtual frame, the coordinate position of that virtual frame is the current position information.
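One way to realize the angle switch is to compute the yaw from the center of the specific scene area toward the player's current position (or toward the coordinate of the selected position control); the 2-D coordinate convention below is an assumption for illustration.

```python
# Sketch of biasing the performance toward the player virtual
# character: compute the yaw from the center of the specific scene
# area to the player's current position.
import math

def viewing_yaw(area_center: tuple, player_pos: tuple) -> float:
    """Angle (degrees) the performance should face so that it is
    turned toward the given position."""
    dx = player_pos[0] - area_center[0]
    dy = player_pos[1] - area_center[1]
    return math.degrees(math.atan2(dy, dx))

# Player stands to the front left of the table -> performance turns left.
print(viewing_yaw((0.0, 0.0), (-2.0, 3.0)))  # about 123.7 degrees
```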
Further, after the specific scene area in the game scene is determined, the game picture is controlled to display interactive material controls of the virtual object, where the interactive material controls include one or more of the following: character dialogue controls, scenario controls, and character controls; in response to a click operation on an interactive material control, the corresponding performance scenario is switched.
In this embodiment, while the multimedia data is being played, the specific scene area may also display the interactive material controls of one or more virtual characters as the scenario of the multimedia data progresses. For example, when the scenario progresses to the conversation of a certain virtual character, the character dialogue is displayed, and the dialogue box is a character dialogue control; after the player clicks the character dialogue control, a character dialogue control containing another virtual character's reply pops up, the player can continue to click, and so on, thereby advancing the development of the scenario.
Alternatively, when the scenario progresses to a certain node, a "scenario control" may pop up, and the player chooses whether to advance the scenario, for example, "enter the second act", "enter the third act", and so on. A "character control" may, when the scenario progresses to a certain node, bring other characters on stage or advance existing characters to develop the scenario, for example, "character X exits", "character Y appears", and so on; the embodiment of the present application is not specifically limited here.
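A minimal sketch of scenario advancement through character dialogue controls, where each click pops up the next reply until a node is reached; all dialogue content is placeholder.

```python
# Each click on the current character dialogue control pops up the
# next character's reply; at the end of the exchange a scenario
# control could be shown instead.
dialogue = [("Character X", "line 1"), ("Character Y", "line 2"),
            ("Character X", "line 3")]
cursor = 0

def on_dialogue_control_clicked() -> None:
    """Display the next character dialogue control on each click."""
    global cursor
    if cursor < len(dialogue):
        speaker, line = dialogue[cursor]
        cursor += 1
        print(f"{speaker}: {line}")
    else:
        print("scenario node reached -> show scenario control")

on_dialogue_control_clicked()  # Character X: line 1
on_dialogue_control_clicked()  # Character Y: line 2
```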
Another embodiment of the present application further provides a game data processing method, where a terminal device provides a graphical user interface, the graphical user interface includes a game screen, and the game screen includes at least part of a game scene and a virtual object. The method includes:
a. Determining a specific scene area in the game scene, wherein the specific scene area contains the virtual object performing according to the performance parameters.
The specific scene area may refer to an area in the game world where the multimedia data can be presented; for example, after a "table", a "martial arts stage", a "banquet hall", or the like appears in a certain game scene, the multimedia data can be presented at these positions, for example, one or more virtual characters performing in the specific scene area.
b. In response to a selection operation on a specific virtual control, the current display interface is switched from the game scene to a multimedia presentation interface, where the multimedia presentation interface contains the virtual object performing according to the performance parameters.
In this embodiment, the specific virtual control may be a specific position in the game scene, a specific NPC, a virtual key, or the like. Through a selection operation on the specific virtual control, the current game interface is switched to the multimedia playing interface, where the player views the multimedia data.
The multimedia playing interface may be another game interface different from the current game world. The multimedia playing interface may include a multimedia display box in which the multimedia data is played, and other areas may further include functional controls such as barrage (bullet comment) input, comment, support, and a progress bar, which is not limited here.
It should be noted that the multimedia data may be multimedia data edited in advance, by the player or by other players, using the aforementioned multimedia editing method; the performance parameters may be the editing parameters input during the editing process. For the specific editing process, reference may be made to the foregoing embodiments, which is not limited here.
Optionally, the multimedia presentation interface includes a multimedia data playlist, and the multimedia data playlist includes: the identification of at least one multimedia data and the corresponding playing virtual control.
Correspondingly, in response to a selection operation on the playing virtual control of the target multimedia data in the multimedia data playlist, the target multimedia data is played.
It should be noted that, after the specific virtual control is operated, the multimedia data playlist may pop up on the game interface, or the multimedia data playlist may be displayed after switching to the multimedia presentation interface; this is not limited here.
The multimedia data playlist can refer to the embodiment shown in fig. 19, and will not be described herein.
The embodiment of the application further provides a terminal device, which may be a mobile phone, a tablet computer, a game console, or the like; the application is not limited in this respect.
Fig. 25 shows a schematic structural diagram of a terminal device provided in an embodiment of the present application, and as shown in fig. 25, the terminal device may include: a processor 21, a storage medium 22 and a bus (not shown), wherein the storage medium 22 stores machine-readable instructions executable by the processor 21, and when the terminal device is operated, the processor 21 communicates with the storage medium 22 via the bus, and the processor 21 executes the machine-readable instructions to perform the method as described in the foregoing method embodiments. The specific implementation and technical effects are similar, and are not described herein again.
The storage medium may include various media capable of storing program code, such as a USB flash drive, a removable hard disk, a ROM, a RAM, a magnetic disk, or an optical disk. The processor may include one or more processing cores (e.g., a single-core or multi-core processor). Merely by way of example, the processor may include a central processing unit (CPU), an application specific integrated circuit (ASIC), an application specific instruction set processor (ASIP), a graphics processing unit (GPU), a physics processing unit (PPU), a digital signal processor (DSP), a field programmable gate array (FPGA), a programmable logic device (PLD), a controller, a microcontroller unit, a reduced instruction set computer (RISC), a microprocessor, or the like, or any combination thereof.
In addition, the present application further provides a storage medium, where a computer program is stored on the storage medium, and when the computer program is executed by a processor, the computer program performs the method described in the foregoing method embodiments. The specific implementation and technical effects are similar, and are not described herein again.
The above description is only a preferred embodiment of the present application and is not intended to limit the present application, and various modifications and changes may be made by those skilled in the art. Any modification, equivalent replacement, improvement and the like made within the spirit and principle of the present application shall be included in the protection scope of the present application.

Claims (11)

1. A game data processing method is characterized in that a terminal device provides a graphical user interface, the graphical user interface comprises a game picture, the game picture comprises at least part of game scenes and virtual objects, and the method comprises the following steps:
determining a specific scene area in the game scene, wherein the specific scene area contains the virtual object performing according to performance parameters;
in response to a selection operation on a specific virtual control, controlling the current display interface to be switched from the game scene to a multimedia presentation interface, wherein the multimedia presentation interface contains the virtual object performing according to the performance parameters.
2. The method of claim 1, wherein displaying, in the specific scene area, the virtual object performing according to the performance parameters can be implemented by:
acquiring multimedia data, wherein the multimedia data comprises virtual object information and performance parameters;
rendering a virtual character corresponding to the virtual object information in the specific scene area according to the performance parameters, and controlling the virtual character to perform.
3. The method of claim 1, wherein the step of controlling the currently displayed interface to switch from the game scene to a multimedia presentation interface containing the virtual object performing according to performance parameters comprises: and controlling the current display interface to be switched to a multimedia playing interface, wherein the multimedia playing interface comprises a multimedia display frame used for playing multimedia data, and the multimedia playing interface is another game picture different from the game world corresponding to the current display interface.
4. The method of claim 3, wherein the multimedia presentation interface comprises a multimedia data playlist, the multimedia data playlist comprising: at least one multimedia data identifier and a corresponding playing virtual control; the step of said multimedia presentation interface containing said virtual objects to be performed according to performance parameters, comprising: and responding to the selection operation of a playing virtual control of the target multimedia data in the multimedia data playing list, and playing the target multimedia data in the multimedia display frame.
5. The method according to any one of claims 2-4, wherein the multimedia data comprises data edited and uploaded by the player corresponding to the current terminal device and by players corresponding to other terminal devices.
6. The method of claim 1, wherein the multimedia data is preset data, wherein the presetting of the multimedia data can be performed by:
providing a multimedia editing interface on the graphical user interface;
controlling the multimedia editing interface to display at least one element selection control, wherein different element selection controls are associated with different time frames;
in response to a selection operation on the element selection control, determining a multimedia element corresponding to the element selection control;
and generating corresponding multimedia data according to the multimedia elements and the time frames corresponding to the multimedia elements.
7. The method of claim 6, wherein the determining the multimedia element corresponding to the element selection control in response to the selection operation of the element selection control comprises:
in response to a selection operation on the element selection control, displaying an element submenu corresponding to the element selection control, wherein the element submenu comprises: at least one element selection item;
in response to a selection operation on the element submenu, displaying a corresponding element editing interface;
and in response to the editing parameters of the element editing interface, acquiring the multimedia elements corresponding to the element selection controls, wherein the performance parameters are the editing parameters input during the editing process.
8. The method of claim 1, further comprising:
in response to the current position information of the player virtual character, controlling the viewing angle of the specific scene area to be switched to the angle corresponding to the current position information.
9. The method of claim 1, wherein the particular virtual control is one or more of: a particular location in the game scene, a particular NPC in the game scene, and a virtual key.
10. A terminal device, comprising: a processor, a storage medium and a bus, the storage medium storing machine-readable instructions executable by the processor, the processor and the storage medium communicating over the bus when the terminal device is operating, the processor executing the machine-readable instructions to perform the method of any one of claims 1-9.
11. A storage medium, characterized in that the storage medium has stored thereon a computer program which, when being executed by a processor, performs the method according to any one of claims 1-9.
CN201911114289.9A 2019-10-18 2019-10-18 Method, device, terminal equipment and storage medium for editing multimedia in game Active CN111124401B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201911114289.9A CN111124401B (en) 2019-10-18 2019-10-18 Method, device, terminal equipment and storage medium for editing multimedia in game

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201910995729.XA CN110737435B (en) 2019-10-18 2019-10-18 Method, device, terminal equipment and storage medium for editing multimedia in game
CN201911114289.9A CN111124401B (en) 2019-10-18 2019-10-18 Method, device, terminal equipment and storage medium for editing multimedia in game

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
CN201910995729.XA Division CN110737435B (en) 2019-10-18 2019-10-18 Method, device, terminal equipment and storage medium for editing multimedia in game

Publications (2)

Publication Number Publication Date
CN111124401A true CN111124401A (en) 2020-05-08
CN111124401B CN111124401B (en) 2023-09-26

Family

ID=69270346

Family Applications (4)

Application Number Title Priority Date Filing Date
CN201911114316.2A Active CN111124402B (en) 2019-10-18 2019-10-18 Method, device, terminal equipment and storage medium for editing multimedia in game
CN201911115374.7A Active CN111124403B (en) 2019-10-18 2019-10-18 Method, device, terminal equipment and storage medium for editing multimedia in game
CN201910995729.XA Active CN110737435B (en) 2019-10-18 2019-10-18 Method, device, terminal equipment and storage medium for editing multimedia in game
CN201911114289.9A Active CN111124401B (en) 2019-10-18 2019-10-18 Method, device, terminal equipment and storage medium for editing multimedia in game

Family Applications Before (3)

Application Number Title Priority Date Filing Date
CN201911114316.2A Active CN111124402B (en) 2019-10-18 2019-10-18 Method, device, terminal equipment and storage medium for editing multimedia in game
CN201911115374.7A Active CN111124403B (en) 2019-10-18 2019-10-18 Method, device, terminal equipment and storage medium for editing multimedia in game
CN201910995729.XA Active CN110737435B (en) 2019-10-18 2019-10-18 Method, device, terminal equipment and storage medium for editing multimedia in game

Country Status (1)

Country Link
CN (4) CN111124402B (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112256251A (en) * 2020-10-29 2021-01-22 北京冰封互娱科技有限公司 Game data processing method, game data processing device, main body object configuration method, main body object configuration device, and storage medium

Families Citing this family (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111399711A (en) * 2020-03-10 2020-07-10 广州通达汽车电气股份有限公司 Interface editing method, device, equipment and storage medium
CN113694531B (en) * 2020-05-21 2024-01-19 抖音视界有限公司 Game special effect generation method and device, electronic equipment and computer readable medium
CN112044061B (en) * 2020-08-11 2022-05-06 腾讯科技(深圳)有限公司 Game picture processing method and device, electronic equipment and storage medium
CN112073799B (en) * 2020-08-31 2022-07-01 腾讯数码(天津)有限公司 Virtual resource management method and device, computer equipment and readable storage medium
CN112118397B (en) * 2020-09-23 2021-06-22 腾讯科技(深圳)有限公司 Video synthesis method, related device, equipment and storage medium
CN112169314A (en) * 2020-10-20 2021-01-05 网易(杭州)网络有限公司 Method and device for selecting target object in game
CN112843723B (en) * 2021-02-03 2024-01-16 北京字跳网络技术有限公司 Interaction method, interaction device, electronic equipment and storage medium
CN114845171A (en) * 2022-03-21 2022-08-02 维沃移动通信有限公司 Video editing method and device and electronic equipment
CN116634233B (en) * 2023-04-12 2024-02-09 北京七彩行云数字技术有限公司 Media editing method, device, equipment and storage medium

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20030064801A1 (en) * 2001-09-28 2003-04-03 Igt Decoupling of the graphical presentation of a game from the presentation logic
CN109446346A (en) * 2018-09-14 2019-03-08 传线网络科技(上海)有限公司 Multimedia resource edit methods and device
CN109582311A (en) * 2018-11-30 2019-04-05 网易(杭州)网络有限公司 A kind of UI is edited in game method and device, electronic equipment, storage medium

Family Cites Families (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
GB2404300A (en) * 2003-07-25 2005-01-26 Autodesk Inc Compositing and temporally editing clips of image data
CN102693091A (en) * 2012-05-22 2012-09-26 深圳市环球数码创意科技有限公司 Method for realizing three dimensional virtual characters and system thereof
CN202929567U (en) * 2012-11-19 2013-05-08 深圳市数虎图像科技有限公司 Virtual character animation performance system
EP3092623A4 (en) * 2014-01-09 2017-08-30 Square Enix Holdings Co., Ltd. Online game server architecture using shared rendering
CN108355355A (en) * 2018-03-16 2018-08-03 深圳冰川网络股份有限公司 A kind of control method and system of 3D sports class online game
CN108961368A (en) * 2018-06-21 2018-12-07 珠海金山网络游戏科技有限公司 The method and system of real-time live broadcast variety show in three-dimensional animation environment
CN109513212B (en) * 2018-11-19 2020-06-12 苏州好玩友网络科技有限公司 2D mobile game UI (user interface) and scenario editing method and system
CN109756511B (en) * 2019-02-02 2021-08-31 珠海金山网络游戏科技有限公司 Data processing method and device, computing equipment and storage medium
CN110062271B (en) * 2019-04-28 2022-03-04 腾讯科技(成都)有限公司 Scene switching method, device, terminal and storage medium
CN110227267B (en) * 2019-06-28 2023-02-28 百度在线网络技术(北京)有限公司 Voice skill game editing method, device and equipment and readable storage medium


Also Published As

Publication number Publication date
CN111124402A (en) 2020-05-08
CN111124401B (en) 2023-09-26
CN110737435A (en) 2020-01-31
CN110737435B (en) 2024-04-19
CN111124403A (en) 2020-05-08
CN111124403B (en) 2023-09-26
CN111124402B (en) 2023-09-26


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant