CN111491179A - Game video editing method and device - Google Patents
- Publication number
- CN111491179A (application number CN202010302098.1A)
- Authority
- CN
- China
- Prior art keywords
- game
- confrontation
- picture
- video
- frame
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Classifications
- A63F13/60—Generating or modifying game content before or while executing the game program, e.g. authoring tools specially adapted for game development or game-integrated level editor
- H04N21/234—Processing of video elementary streams, e.g. splicing of video streams or manipulating encoded video stream scene graphs
- H04N21/23418—Processing of video elementary streams involving operations for analysing video streams, e.g. detecting features or characteristics
- H04N21/44—Processing of video elementary streams, e.g. splicing a video clip retrieved from local storage with an incoming video stream or rendering scenes according to encoded video stream scene graphs
- H04N21/44008—Processing of video elementary streams involving operations for analysing video streams, e.g. detecting features or characteristics in the video stream
Abstract
The application discloses a game video editing method and device. The method acquires an original game video and extracts multiple reference game frames from it. For each reference frame, it identifies the position of each game character in the game scene and the opposing side to which each character belongs. It then calculates a confrontation parameter for each reference frame based on the number of game characters in the scene and the distances between characters on opposing sides, and clips a target game video segment out of the original video according to those per-frame confrontation parameters. By exploiting characteristics of in-game combat, namely the distances between characters on different sides and the number of characters engaged, the method measures the intensity of each engagement in order to locate highlight segments, greatly improving the accuracy and speed of clipping game highlights.
Description
Technical Field
The application relates to the field of computer technology, and in particular to a game video editing method and device.
Background
As living standards rise, so does the demand for entertainment, and many people enjoy playing video games in their spare time. While playing, users often want to share highlights of their own matches with others. For some games, however, a single match is too long, sometimes more than half an hour, and sharing the whole recording is impractical. The highlight segments of the game video therefore need to be clipped out, producing a shorter video that is convenient to share.
In the prior art, highlight segments are usually clipped manually, which requires a large amount of editing work. Other approaches clip segments by detecting game events in the video, such as kills, assists, and highlight text; but clipping based only on such events is too narrow and not accurate. Many highlight moments come from intense fights between the two sides in which no kill occurs at all, and an event-based clipping approach cannot capture these shots.
Disclosure of Invention
The embodiments of the present application provide a game video editing method and device, which can greatly improve the accuracy and speed of clipping highlight game segments.
An embodiment of the present application provides a game video clipping method, comprising the following steps:
acquiring an original game video, and extracting multiple reference game frames from the original game video;
for each reference game frame, identifying the position information of each game character in the game scene and the opposing-side information of each game character;
for each reference game frame, calculating a confrontation parameter of the frame based on the number of game characters in the game scene and the distances between game characters on opposing sides, thereby obtaining a confrontation parameter for every reference game frame, where the confrontation parameter characterizes the intensity of the confrontation between the opposing sides in that frame;
clipping a target game video segment out of the original game video based on the confrontation parameters of the reference game frames.
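The steps above can be sketched end to end. The following is a minimal illustration on synthetic per-frame data: the character positions and side labels stand in for the output of the identification step, and the inverse-distance scoring, averaging fusion, and all names are illustrative assumptions rather than the patent's exact formulas.

```python
import math

def frame_confrontation(characters, max_range=300.0):
    """Confrontation parameter of one frame: sum an inverse-distance
    contribution over every pair of characters on opposing sides."""
    score = 0.0
    for i, (side_a, xa, ya) in enumerate(characters):
        for side_b, xb, yb in characters[i + 1:]:
            if side_a == side_b:
                continue  # teammates do not confront each other
            d = math.hypot(xa - xb, ya - yb)
            if d < max_range:
                score += 1.0 - d / max_range  # closer pairs fight harder
    return score

def clip_target_segment(frames, frames_per_segment):
    """Split the per-frame scores into fixed-length segments, fuse each
    segment by averaging, and return (start, end) of the best segment."""
    scores = [frame_confrontation(f) for f in frames]
    best, best_avg = 0, float("-inf")
    for start in range(0, len(scores), frames_per_segment):
        chunk = scores[start:start + frames_per_segment]
        avg = sum(chunk) / len(chunk)
        if avg > best_avg:
            best, best_avg = start, avg
    return best, min(best + frames_per_segment, len(scores))

# Synthetic video: two quiet frames, then a close 2-vs-1 fight.
frames = [
    [("red", 0, 0), ("blue", 900, 900)],                 # no engagement
    [("red", 0, 0), ("blue", 900, 900)],
    [("red", 10, 0), ("red", 0, 10), ("blue", 50, 50)],  # close fight
    [("red", 20, 0), ("blue", 60, 40)],
]
print(clip_target_segment(frames, frames_per_segment=2))  # -> (2, 4)
```

The segment containing the close fight is selected, matching the intuition that engagement intensity, not kill events, drives the clipping decision.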
Accordingly, an embodiment of the present application provides a game video clipping device, comprising:
an acquisition unit, configured to acquire an original game video and extract multiple reference game frames from the original game video;
an identification unit, configured to identify, for each reference game frame, the position information of each game character in the game scene and the opposing-side information of each game character;
a calculation unit, configured to calculate, for each reference game frame, a confrontation parameter of the frame based on the number of game characters in the game scene and the distances between game characters on opposing sides, where the confrontation parameter characterizes the intensity of the confrontation between the opposing sides in that frame;
a clipping unit, configured to clip a target game video segment out of the original game video based on the confrontation parameters of the reference game frames.
Optionally, in some embodiments of the present application, the identification unit may include a first obtaining subunit, a first identification subunit, and a second identification subunit, as follows:
the first obtaining subunit, configured to obtain, for each reference game frame, a preset life value indication template of the game characters in the game scene;
the first identification subunit, configured to identify the position information of each game character in the game scene by sliding the preset life value indication template over the reference game frame;
the second identification subunit, configured to identify the side to which each game character in the reference game frame belongs, obtaining the opposing-side information of the game characters in the reference game frame.
Optionally, in some embodiments of the present application, the second identification subunit may be specifically configured to obtain the color information of the life value indication template of each game character in the reference game frame, and to determine, based on that color information, the side to which the game character belongs.
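The color check described here can be sketched with a simple channel comparison. In the sketch below, the array stands in for the pixels of a detected life value indication template (health bar); the red/blue side labels and the mean-channel rule are illustrative assumptions, and a real game would likely need calibrated color ranges.

```python
import numpy as np

def classify_side(bar_pixels):
    """Guess which side a health bar belongs to from its dominant color.

    bar_pixels: H x W x 3 uint8 RGB array cropped at the position where
    the life value indication template was matched."""
    mean = bar_pixels.reshape(-1, 3).mean(axis=0)  # mean R, G, B
    return "red" if mean[0] > mean[2] else "blue"

# A mostly-red 4x10 bar patch versus a mostly-blue one.
red_bar = np.zeros((4, 10, 3), dtype=np.uint8)
red_bar[..., 0] = 220
blue_bar = np.zeros((4, 10, 3), dtype=np.uint8)
blue_bar[..., 2] = 220
print(classify_side(red_bar), classify_side(blue_bar))  # red blue
```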
Optionally, in some embodiments of the present application, the first identification subunit may be specifically configured to identify the position of each game character's life value indication template by sliding the preset life value indication template over the reference game frame, and to obtain the position information of the game character in the game scene based on the position of its life value indication template.
Optionally, in some embodiments, the step of identifying the position of the life value indication template of a game character by sliding the preset template over the reference game frame may specifically include:
sliding the preset life value indication template over the reference game frame and, at each position, calculating the grayscale similarity between the template and the region of the frame it overlaps;
determining the position of the life value indication template of the game character in the reference game frame based on the grayscale similarity.
Optionally, in some embodiments, the step of determining the position of the life value indication template based on the grayscale similarity may include:
generating a matching map for the reference game frame based on the grayscale similarities, where the matching map consists of pixels whose values are the grayscale similarity between the preset template and the region of the frame it overlaps at that pixel;
determining the pixels of the matching map whose grayscale similarity exceeds a preset threshold as matched pixels;
determining the position of the life value indication template of the game character in the reference game frame based on the positions of the matched pixels in the matching map.
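The sliding-and-similarity procedure above is classic template matching; in practice it could be done with OpenCV's `cv2.matchTemplate`. The pure-NumPy sketch below builds a matching map of similarity values, thresholds it, and returns the matched positions. The sum-of-absolute-differences similarity and the 0.95 threshold are illustrative choices, not the patent's.

```python
import numpy as np

def matching_map(frame, template):
    """Slide `template` over `frame` (both 2-D grayscale arrays) and
    return a map whose value at (y, x) is the similarity in [0, 1] of
    the overlapped region (1 = identical)."""
    th, tw = template.shape
    H, W = frame.shape
    out = np.zeros((H - th + 1, W - tw + 1))
    denom = 255.0 * template.size  # worst-case total absolute difference
    for y in range(out.shape[0]):
        for x in range(out.shape[1]):
            patch = frame[y:y + th, x:x + tw].astype(float)
            sad = np.abs(patch - template).sum()
            out[y, x] = 1.0 - sad / denom
    return out

def match_positions(frame, template, threshold=0.95):
    """Matching-map pixels above `threshold` are the matched points,
    i.e. candidate health-bar (and hence character) positions."""
    m = matching_map(frame, template)
    ys, xs = np.where(m > threshold)
    return list(zip(ys.tolist(), xs.tolist()))

# A 2x4 "health bar" template planted at (3, 5) in a dark frame.
template = np.full((2, 4), 200.0)
frame = np.zeros((10, 12))
frame[3:5, 5:9] = 200.0
print(match_positions(frame, template))  # [(3, 5)]
```

The double loop is for clarity; a production version would use a vectorized or FFT-based correlation for speed.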
Optionally, in some embodiments of the present application, the calculation unit may include a second obtaining subunit, a calculation subunit, and a first fusion subunit, as follows:
the second obtaining subunit, configured to obtain, for each reference game frame, the distances between game characters on different opposing sides;
the calculation subunit, configured to calculate, based on these distances, a local confrontation parameter for each game character on each side;
the first fusion subunit, configured to obtain the confrontation parameter of the reference game frame by fusing the local confrontation parameters of the game characters on every side.
Optionally, in some embodiments, the calculation subunit may be specifically configured to, for each game character on each side, weight the distances from that character to every character on the other sides, obtaining the local confrontation parameter of that character, and thereby the local confrontation parameters of the game characters on every side.
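The per-character weighting described here can be illustrated as follows. The inverse-distance weight is an assumed example of a weighting function (the text does not fix one), chosen so that nearer enemies contribute more to a character's local confrontation parameter.

```python
import math

def local_confrontation(character, enemies, eps=1.0):
    """Local confrontation parameter of one character: weight the
    distances to every character on the other sides, here with an
    inverse-distance weight so that closer enemies count more."""
    x, y = character
    total = 0.0
    for ex, ey in enemies:
        d = math.hypot(x - ex, y - ey)
        total += 1.0 / (d + eps)  # eps avoids division by zero at d = 0
    return total

# A character close to one enemy scores higher than one far from all.
near_and_far = local_confrontation((0, 0), [(3, 4), (300, 400)])
far_only = local_confrontation((0, 0), [(300, 400)])
print(near_and_far > far_only)  # True
```

Fusing these per-character values (for example, summing them over all characters) then yields the frame's overall confrontation parameter.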
Optionally, in some embodiments of the present application, the clipping unit may include a dividing subunit, a second fusion subunit, and a determining subunit, as follows:
the dividing subunit, configured to divide the original game video into multiple video segments at a preset time interval;
the second fusion subunit, configured to fuse the confrontation parameters of the reference game frames within each video segment, obtaining a fused confrontation parameter for each segment;
the determining subunit, configured to determine the target game video segment from the multiple video segments based on the fused confrontation parameters.
An electronic device provided by an embodiment of the present application comprises a processor and a memory, the memory storing a plurality of instructions that the processor loads and executes to perform the steps of the game video clipping method provided by the embodiments of the present application.
In addition, an embodiment of the present application provides a storage medium storing a computer program which, when executed by a processor, implements the steps of the game video clipping method provided by the embodiments of the present application.
The embodiments of the present application provide a game video clipping method and device that acquire an original game video and extract multiple reference game frames from it; identify, for each reference game frame, the position information of each game character in the game scene and the opposing-side information of each character; calculate, for each reference game frame, a confrontation parameter based on the number of game characters in the scene and the distances between characters on opposing sides, the confrontation parameter characterizing the intensity of the confrontation in that frame; and clip a target game video segment out of the original game video based on the per-frame confrontation parameters. By exploiting characteristics of in-game combat, namely the distances between characters on different sides and the number of characters engaged, the method measures the intensity of each engagement in order to locate highlight segments, greatly improving the accuracy and speed of clipping game highlights.
Drawings
To illustrate the technical solutions in the embodiments of the present application more clearly, the drawings needed in the description of the embodiments are briefly introduced below. Obviously, the drawings described below show only some embodiments of the present application, and those skilled in the art can derive other drawings from them without creative effort.
FIG. 1a is a scene diagram of the game video clipping method provided in an embodiment of the present application;
FIG. 1b is a flowchart of the game video clipping method provided in an embodiment of the present application;
FIG. 1c is a template diagram of the game video clipping method provided in an embodiment of the present application;
FIG. 1d is a display diagram of the game video clipping method provided in an embodiment of the present application;
FIG. 1e is another display diagram of the game video clipping method provided in an embodiment of the present application;
FIG. 1f is a matching map of the game video clipping method provided in an embodiment of the present application;
FIG. 1g is a processed matching map of the game video clipping method provided in an embodiment of the present application;
FIG. 1h is another diagram of the game video clipping method provided in an embodiment of the present application;
FIG. 1i is another diagram of the game video clipping method provided in an embodiment of the present application;
FIG. 1j is another display diagram of the game video clipping method provided in an embodiment of the present application;
FIG. 1k is another display diagram of the game video clipping method provided in an embodiment of the present application;
FIG. 1l is a confrontation-parameter distribution diagram of the game video clipping method provided in an embodiment of the present application;
FIG. 2 is another flowchart of the game video clipping method provided in an embodiment of the present application;
FIG. 3a is a structural diagram of the game video clipping device provided in an embodiment of the present application;
FIG. 3b is another structural diagram of the game video clipping device provided in an embodiment of the present application;
FIG. 3c is another structural diagram of the game video clipping device provided in an embodiment of the present application;
FIG. 3d is another structural diagram of the game video clipping device provided in an embodiment of the present application;
FIG. 4 is a structural diagram of the electronic device provided in an embodiment of the present application.
Detailed Description
The technical solutions in the embodiments of the present application will be described clearly and completely below with reference to the accompanying drawings. Obviously, the described embodiments are only some, not all, of the embodiments of the present application. All other embodiments obtained by those skilled in the art based on the embodiments herein without creative effort fall within the protection scope of the present application.
The embodiments of the present application provide a game video clipping method and device. Specifically, the embodiments provide a game video clipping device suitable for an electronic device, which may be a terminal, a server, or the like.
It is understood that the game video clipping method of this embodiment may be executed on the terminal, on the server, or jointly by the terminal and the server.
Referring to FIG. 1a, take as an example the terminal and the server jointly executing the game video clipping method. The game video clipping system provided by the embodiment of the present application includes a terminal 10 and a server 11; the terminal 10 and the server 11 are connected via a network, for example a wired or wireless connection, and the game video clipping device may be integrated in the server.
The terminal 10 may record an original game video and send a highlight-clipping instruction together with the original game video to the server 11, so that the server 11 extracts multiple reference game frames from the original game video, calculates a confrontation parameter for each reference game frame, clips the highlight game segment out of the original game video based on the per-frame confrontation parameters, and returns the highlight segment to the terminal 10. The terminal 10 may be a game console, a mobile phone, a smart TV, a tablet computer, a notebook computer, a personal computer (PC), or the like.
The server 11 may be configured to: acquire the original game video, i.e. the video from which a highlight segment is to be clipped, and extract multiple reference game frames from it; identify, for each reference game frame, the position information of each game character in the game scene and the opposing-side information of each character; calculate, for each reference game frame, a confrontation parameter based on the number of game characters in the scene and the distances between characters on opposing sides, the confrontation parameter characterizing the intensity of the confrontation in that frame; clip a target game video segment, i.e. a highlight segment of the original game video, out of the original game video based on the per-frame confrontation parameters; and then transmit the highlight segment to the terminal 10. The server 11 may be a single server or a server cluster composed of multiple servers.
The above process of clipping the target game video segment, described for the server 11, may also be executed by the terminal 10.
The game video clipping method provided by the embodiments of the present application relates to computer vision (CV) technology in the field of artificial intelligence (AI). The embodiments exploit characteristics of in-game combat, calculating the intensity of each engagement from the distances between game characters on different sides and the number of characters engaged, in order to determine the highlight game segments, which greatly improves the accuracy and speed of clipping game highlights.
Artificial intelligence (AI) is a theory, method, technique, and application system that uses a digital computer, or a machine controlled by a digital computer, to simulate, extend, and expand human intelligence, perceive the environment, acquire knowledge, and use knowledge to obtain the best results. In other words, artificial intelligence is a comprehensive branch of computer science that attempts to understand the essence of intelligence and to produce a new kind of intelligent machine that can react in a manner similar to human intelligence. Artificial intelligence studies the design principles and implementation methods of various intelligent machines, so that machines gain the capabilities of perception, reasoning, and decision-making. Artificial intelligence is a comprehensive discipline involving a wide range of fields, covering both hardware-level and software-level technologies. Artificial intelligence software technologies mainly include computer vision, speech processing, natural language processing, and machine learning/deep learning.
Computer vision (CV) is the science of how to make machines "see"; more specifically, it uses cameras and computers, in place of human eyes, to identify, track, and measure targets, and further performs graphics processing so that the result is an image better suited for human observation or for transmission to instruments for detection. As a scientific discipline, computer vision studies related theories and techniques in an attempt to build artificial intelligence systems that can obtain information from images or multidimensional data. Computer vision technologies generally include image processing, image recognition, image semantic understanding, image retrieval, OCR, video processing, video semantic understanding, video content/behavior recognition, three-dimensional object reconstruction, 3D technology, virtual reality, augmented reality, simultaneous localization and mapping, and other technologies, as well as common biometric technologies such as face recognition and fingerprint recognition.
The following is a detailed description. Note that the order in which the embodiments are described below is not intended to limit any preferred order of the embodiments.
This embodiment will be described from the perspective of a game video clipping device, which may be integrated in an electronic device; the electronic device may be a server or a terminal.
The game video clipping method can be applied to any scenario in which highlight game video segments need to be clipped. For example, after playing a match, a user may want to share the recorded game video with others, but the full video is too long to share as a whole. With the game video clipping method provided by this embodiment, the video can be clipped into highlight segments. The method clips the highlights of a game video more accurately and quickly, and can automatically generate, from a long game video, a clipped video containing only the highlights, greatly reducing manual editing work.
As shown in FIG. 1b, the specific flow of the game video clipping method is as follows. The method may be executed by a server or a terminal, which is not limited in this embodiment.
101. Obtain an original game video, and extract multiple reference game frames from the original game video.
In this embodiment, the original game video is the game video to be clipped, and a target game video segment is clipped out of it; the target game video segment may be a highlight segment of the original game video. The game type of the original game video is not limited, and the game scene may be any battle-type scene, for example a massively multiplayer online role-playing game (MMORPG) scene, a sports game (SPG) scene, or a multiplayer online battle arena (MOBA) scene, which is not limited in this embodiment.
There are various ways to obtain the original game video.
For example, the original game video may be recorded by a video recording function on the electronic device: when a game video recording request is received, recording is started, and the recorded game video serves as the original game video.
For example, the original game video may also be obtained from the electronic device's local database: the original game video is stored in the local database and, when a clipping instruction for a highlight segment of the original game video is received, it is read directly from that database ("local" here referring to the electronic device itself).
For example, the original game video may be obtained over the Internet, e.g. by downloading, and provided to the game video clipping device.
For example, the original game video may also be obtained by another device and then provided to the game video clipping device; that is, the game video clipping device may receive the original game video sent by another device, such as another terminal.
As for video, video content consists of a series of video frames; the frame rate is usually expressed in frames per second (FPS). Each video frame is a still image, and playing the frames in order creates a moving picture. For example, video content with an FPS of 30 plays 30 "still images" per second.
In this embodiment, the original game video includes a plurality of video frames, the video frames are game pictures, a part of the video frames may be extracted from the original game video as reference game pictures according to a preset time interval, and the preset time interval may be specifically set according to an actual situation. For example, the original game video has 35 frames of video frames per second, and the decimation may be performed at 5 frame intervals, i.e., 7 frames of video frames per second as the reference game picture.
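The sampling described above can be sketched in a few lines of Python. This is a minimal illustration, assuming the video has already been decoded into a list of frames; the function name and the 35 fps / 5-frame-interval figures are taken from the example in the text.

```python
def extract_reference_frames(frames, interval=5):
    """Take every `interval`-th decoded video frame as a reference game picture."""
    return frames[::interval]

# Dummy frame ids stand in for decoded images: one second of 35 fps video.
frames = list(range(35))
refs = extract_reference_frames(frames, interval=5)
# Sampling 35 frames at a 5-frame interval yields 7 reference pictures per second.
```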
102. For each frame of reference game picture, identifying the position information of the game character in the game scene in the reference game picture and the game opponent information of the game character.
The game character is a virtual character in the game scene, and may be a virtual person or a virtual animal. A game confrontation party refers to a game camp (faction): in typical games, game characters are divided into different camps, such as a red camp and a blue camp; the number of camps is usually two or more, and each camp includes at least one game character. Confrontation between different camps generally produces the highlight segments of the game. In the target game scene, game characters in the same camp cooperate with one another, while game characters in different camps compete against one another.
Game scenes are generally built with visual elements that attract players, and these elements can be used to locate game characters, or to find the part of the object to be detected that changes the least. For example, a game character usually carries a life value indication template overhead, which represents the character's battle value or life value; when that value falls below a certain threshold, the corresponding game character exits the battle. In particular, the life value indication template may be a blood bar, i.e., a fixed-shape blood bar displayed above the game character's head. Since the life value indication template sits above the game character, the character's position can be determined from the template's position.
Optionally, in some embodiments, the step "identifying, for each frame of the reference game picture, position information of the game character in the game scene in the reference game picture and game counterparty information to which the game character belongs" may include:
aiming at each frame of reference game picture, acquiring a preset life value indication template of a game role in a game scene;
identifying position information of a game role in a game scene in a reference game picture based on the sliding preset life value indication template;
and identifying the game counter party to which the game role in the reference game picture belongs to obtain the information of the game counter party to which the game role in the reference game picture belongs.
In some embodiments, the step of "identifying the position information of the game character in the game scene in the reference game screen based on the sliding preset life value indication template" may include:
identifying the position of a life value indication template of a game character in a reference game picture based on the sliding of a preset life value indication template on the reference game picture;
and acquiring the position information of the game role in the game scene in the reference game picture based on the position of the life value indication template.
The preset life value indication template may be a blood bar template. A life value indication template, i.e., a blood bar, is displayed above each game character and indicates that character's life value or battle value. Because the outline of the blood bar is the same for every game character, the positions of all blood bars can be found in the game picture by template matching, and the position of each game character in the reference game picture can then be located from the position of its blood bar.
Template matching is a technique for finding, within one image, the region that best matches (is most similar to) another template image. By sliding the blood bar template over the reference game picture, the region most similar to the template can be found; that region is the position of a game character's blood bar in the reference game picture, and the area just below it is the position of the game character in the game scene.
The preset life value indication template, i.e., the blood bar template, can be set according to actual conditions, which this embodiment does not limit. For example, since the blood amount, color, skill cooling time, and similar contents of a blood bar vary, the unchanging (common) part of the blood bar can be used as the template, so that the blood bars of game characters can be recognized in as many situations as possible. As shown in fig. 1c, because the blood amount changes during play, only the leading portion of the blood bar — the frame outline and the level number box in front of it — is taken as the blood bar template; this ensures that only the shape features of the blood bar are matched in the reference game picture, while accommodating the fact that the blood amount differs from moment to moment. In addition, the template can match blood bars in various special scenes: for example, as shown in fig. 1d, even when the edge area of a blood bar is cut off by the screen boundary, its position can still be detected, because the template matches only part of the blood bar; other situations, such as several overlapping blood bars, can likewise be recognized and are not listed here.
Optionally, in some embodiments, the step "identifying the position of the life value indication template of the game character in the reference game screen based on the sliding of the preset life value indication template on the reference game screen" may specifically include:
based on the sliding of a preset life value indication template on a reference game picture, calculating the gray level similarity of an overlapped area of the preset life value indication template and the reference game picture after the sliding;
and determining the position of the life value indication template of the game character in the reference game picture based on the gray similarity.
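The sliding grayscale-similarity computation described in the two steps above can be sketched in plain Python. This is an illustrative, unoptimized sketch — the function names are invented here, and in practice the document's own suggestion (OpenCV's built-in template matching) would be far faster:

```python
import math

def ncc(patch, template):
    """Grayscale similarity (normalized cross-correlation) of two equally
    sized patches; 1.0 means a perfect match, negative values a poor one."""
    h, w = len(template), len(template[0])
    n = h * w
    mp = sum(sum(row) for row in patch) / n
    mt = sum(sum(row) for row in template) / n
    num = sum((patch[i][j] - mp) * (template[i][j] - mt)
              for i in range(h) for j in range(w))
    dp = math.sqrt(sum((patch[i][j] - mp) ** 2
                       for i in range(h) for j in range(w)))
    dt = math.sqrt(sum((template[i][j] - mt) ** 2
                       for i in range(h) for j in range(w)))
    return num / (dp * dt) if dp > 0 and dt > 0 else 0.0

def match_map(image, template):
    """Slide the template over the grayscale image and record the similarity
    at every position, producing the matching map described in the text."""
    th, tw = len(template), len(template[0])
    return [[ncc([row[x:x + tw] for row in image[y:y + th]], template)
             for x in range(len(image[0]) - tw + 1)]
            for y in range(len(image) - th + 1)]

# A tiny 2x2 "blood bar template" embedded in a 4x5 grayscale image:
template = [[10, 200],
            [200, 10]]
image = [[0, 0,   0,   0, 0],
         [0, 0,  10, 200, 0],
         [0, 0, 200,  10, 0],
         [0, 0,   0,   0, 0]]
m = match_map(image, template)  # peak similarity at row 1, column 2
```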
Optionally, in some embodiments, the step "determining the position of the life value indication template of the game character in the reference game screen based on the grayscale similarity" may include:
generating a matching graph corresponding to the reference game picture based on the gray level similarity, wherein the matching graph comprises a plurality of pixel points, and the values of the pixel points represent the gray level similarity of a preset life value indication template in the overlapping area of the pixel points and the reference game picture;
determining pixel points with the gray similarity higher than a preset similarity in the matching image as matching pixel points;
and determining the position of the life value indication template of the game role in the reference game picture based on the positions of the matched pixel points in the matching graph corresponding to the reference game picture.
In some embodiments, the blood bars of game characters have different colors; for example, different game confrontation parties use different blood bar colors, such as red blood bars and blue blood bars. Before the step of "calculating, based on the sliding of the preset life value indication template on the reference game picture, the grayscale similarity of the overlap region after sliding", the reference game picture needs to be converted from a color image to a grayscale image, which is then used for template matching.
The matching map is specifically a map of the grayscale similarity between the preset life value indication template and the overlap region of the reference game picture, and has the same size as the corresponding reference game picture. The higher the grayscale similarity at a pixel, the better the preset life value template matches at that pixel, i.e., the more likely that pixel marks the position of a blood bar.
As shown in fig. 1e and fig. 1f, fig. 1f is the matching map generated by template matching on fig. 1e; the positions pointed to by the arrows are the blood bar positions of the game characters. Each pixel value in the matching map represents how well the blood bar template matches the reference game picture at that point. The generated matching map may be thresholded: pixels whose grayscale similarity exceeds a preset similarity are taken as matching pixels. The preset similarity can be chosen according to actual conditions, which this embodiment does not limit — for example, from developer experience; here 0.5 is used as the threshold, and the thresholded result is shown in fig. 1g. After thresholding, only a few small white dots (in the direction of the arrow) remain in fig. 1g; these are the matching pixels. Since several matching pixels survive at each blood bar position, matching pixels that are close together are merged, and one matching pixel in each region is taken as the starting point of that blood bar position.
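The thresholding-and-merging step above can be sketched as follows. This is a minimal illustration with invented names; the 0.5 threshold comes from the text, while the merge radius is an assumption.

```python
def find_bar_anchors(match_map, threshold=0.5, merge_dist=3):
    """Threshold the matching map, then merge matching pixels that lie close
    together, keeping one anchor (blood bar starting point) per region."""
    # All pixels whose similarity exceeds the preset threshold, in row order.
    pts = [(x, y) for y, row in enumerate(match_map)
           for x, v in enumerate(row) if v > threshold]
    anchors = []
    for x, y in pts:
        for ax, ay in anchors:
            # Close to an already kept anchor: same blood bar, skip it.
            if abs(x - ax) <= merge_dist and abs(y - ay) <= merge_dist:
                break
        else:
            anchors.append((x, y))
    return anchors

# Two clusters of above-threshold pixels collapse to two anchors:
mm = [[0.9, 0.92, 0.0],
      [0.0, 0.0,  0.0],
      [0.0, 0.0,  0.8]]
anchors = find_bar_anchors(mm, merge_dist=1)
```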
The blood bar template matching can be completed conveniently and quickly with the normalized correlation coefficient matching (TM_CCOEFF_NORMED) mode of the template matching function (matchTemplate) built into the Open Source Computer Vision Library (OpenCV). OpenCV is a BSD-licensed (Berkeley Software Distribution) cross-platform computer vision library that can run on the Linux, Windows, Android, and Mac OS operating systems. In TM_CCOEFF_NORMED mode, a positive value indicates a better match and a negative value a poorer one, and the larger the value, the better the match. The matchTemplate function computes, position by position, the grayscale similarity of the overlap region between the blood bar template and the reference game picture, and stores the results in an image: each point of the matching map represents one similarity value.
Optionally, in some embodiments, the step of "identifying the game counterparty to which the game character belongs in the reference game screen, and obtaining the game counterparty information to which the game character belongs in the reference game screen" may specifically include:
acquiring color information of a life value indication template of a game character in the reference game picture;
and determining game counter party information to which the game character belongs in the reference game picture based on the color information of the life value indication template.
The life value indication template may be a blood bar. In a game scene, different game confrontation parties are usually distinguished by blood bars of different colors, so the image of the blood bar region (i.e., the position corresponding to the blood bar) of each game character can be extracted and its colors averaged, and the character's game confrontation party is determined by checking which color range the average falls into. For example, with two game confrontation parties A and B, where party A's blood bars are red and party B's are blue, a game character whose averaged blood bar color is closer to blue belongs to party B.
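The color-averaging classification above can be sketched as follows. The two reference colors and the function name are illustrative assumptions; real blood bar colors depend on the game.

```python
def classify_party(bar_pixels,
                   parties={"red": (200, 50, 50), "blue": (50, 80, 200)}):
    """Average the RGB pixels of a blood bar region and assign the game
    character to the party whose reference color is nearest (illustrative
    reference colors; squared Euclidean distance in RGB space)."""
    n = len(bar_pixels)
    avg = tuple(sum(p[c] for p in bar_pixels) / n for c in range(3))
    return min(parties, key=lambda k: sum((avg[c] - parties[k][c]) ** 2
                                          for c in range(3)))

# A bluish blood bar region is assigned to the blue party:
party = classify_party([(40, 70, 210), (60, 90, 190)])
```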
103. Aiming at each frame of reference game picture, calculating the confrontation parameter of the reference game picture based on the number of game characters in the game scene and the distance between the game characters when the game confrontation parties confront with each other, and obtaining the confrontation parameter of each frame of reference game picture, wherein the confrontation parameter represents the confrontation intensity of each game confrontation party in each frame of reference game picture.
The confrontation parameter is a battle index that represents the confrontation intensity of the game confrontation parties in the reference game picture. In a game scene, the magnitude of the confrontation parameter depends on the number of game characters in the battle and on the distances between game characters of different confrontation parties. Referring to fig. 1h, 1i, 1j, and 1k, it can be seen that the larger the number of game characters, and the closer the characters of different confrontation parties are to each other, the larger the confrontation parameter. Figs. 1h–1k show game scenes of reference game pictures, in which the squares represent the frames where the game characters are located, the circles represent game characters, white and black circles represent different game confrontation parties, the bar above each circle is that character's blood bar, and the number at the front of the blood bar is that character's level. Optionally, in some embodiments, the level of the game character may also be considered when calculating the confrontation parameter.
Alternatively, in some embodiments, the step of "calculating a confrontation parameter of a reference game picture for each frame based on the number of game characters in the game scene and the distance between the game characters when the game confrontation party confronts" may include:
aiming at each frame of reference game picture, obtaining the distance between the game roles when different game counterweights play;
calculating local confrontation parameters of game characters in the game confrontation parties based on the distance;
and obtaining the confrontation parameters of the reference game picture by fusing the local confrontation parameters of each game role in each game confrontation party.
The distance between game characters can be replaced by the distance between the blood bars corresponding to those characters. Specifically, the distance between game characters of different confrontation parties may be taken as the distance between the starting points of their corresponding blood bars.
The local confrontation parameters may be fused by adding up all the local confrontation parameters in the reference game picture to obtain that picture's confrontation parameter. The local confrontation parameter of a game character in one confrontation party can be calculated from the distances between that character and each game character of the other confrontation parties during the confrontation.
Optionally, in some embodiments, the step of "calculating a local confrontation parameter of the game character in each game confrontation based on the distance" may specifically include:
and for each game role in each game opponent, carrying out weighting processing on the distance from the game role in the game opponent to each game role in other game opponents to obtain the local confrontation parameters of the game role in the game opponent and obtain the local confrontation parameters of the game role in each game opponent.
Specifically, in a game scene of a reference game picture, there are two game confrontation parties, one is a red side and one is a blue side, and the confrontation parameter calculation process of the reference game picture is as shown in the following equations (1) (2):
where score denotes the confrontation parameter of the reference game picture; d denotes the distance between game characters of different confrontation parties, specifically the Euclidean distance between the starting points of two blood bars; L_red denotes the number of game characters on the red side and L_blue the number on the blue side; (x_i^red, y_i^red) denotes the blood bar position coordinates of the i-th red-side game character, and (x_j^blue, y_j^blue) those of the j-th blue-side game character; and w and h denote the width and height of the video frame, i.e., of the reference game picture.
Equation (1) shows that when the blood bar positions of characters from different camps are less than a certain distance T apart, the local confrontation parameter is calculated from that distance; when the separation is not less than T, the local confrontation parameter is 0. The distance T can be set according to the actual situation.
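A minimal sketch of this pairwise scoring follows. Since equations (1) and (2) are not reproduced in the text, the specific weighting (T - d) / T for d < T is an assumption made here for illustration; the text only states that closer pairs contribute more, that pairs farther apart than T contribute 0, and that local parameters are summed.

```python
import math

def confrontation_score(red, blue, T=200.0):
    """Sum of local confrontation parameters over all red/blue character
    pairs.  `red` and `blue` are lists of blood bar starting points (x, y)
    in pixels; T is the cutoff distance from equation (1).  The linear
    weighting (T - d) / T is an illustrative assumption."""
    score = 0.0
    for xr, yr in red:
        for xb, yb in blue:
            d = math.hypot(xr - xb, yr - yb)  # Euclidean distance of starting points
            if d < T:
                score += (T - d) / T
    return score

# One red and one blue character 100 px apart contribute 0.5;
# a pair 500 px apart (beyond T) contributes nothing.
near = confrontation_score([(0, 0)], [(0, 100)])
far = confrontation_score([(0, 0)], [(0, 500)])
```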
104. Based on the confrontation parameters of each frame of reference game picture, a target game video segment is cut out from the original game video.
In one embodiment, the confrontation parameter of the original game video is calculated at 10-frame intervals, i.e., reference video frames are extracted every 10 frames, yielding the confrontation parameter distribution shown in fig. 1l, where the abscissa is the video frame number and the ordinate is the confrontation parameter, i.e., the battle index. It can be seen that a fierce battle between different game camps occurs between frames 1100 and 2100; the isolated points where the index drops to 0 (for example, around frame 1250) are moments when both sides pull apart to reposition, but they still belong to the battle segment.
Alternatively, in some embodiments, the step of "cutting out the target game video clip from the original game video based on the confrontation parameter of each frame of the reference game picture" may include:
dividing the original game video into a plurality of video segments according to a preset time interval;
fusing the confrontation parameters of the reference game pictures in each video clip to obtain the confrontation parameters after the fusion of the video clips;
and determining a target game video clip from a plurality of video clips based on the fused confrontation parameter.
The preset time interval may be set according to the actual situation, which this embodiment does not limit; for example, the post-fusion confrontation parameter may be counted in units of one second. Within each second, the fused confrontation parameter is computed from the reference frames sampled at the chosen frame interval — taking 35 frames per second and a 5-frame interval as an example, 7 reference game pictures contribute to each second's fused parameter. Computing fused parameters over such windows prevents an isolated confrontation parameter of 0 — caused by both sides pulling apart within a battle segment, as in fig. 1l — from alone deciding whether the current second qualifies as a target game video segment; fusing the confrontation parameters reduces this influence.
For example, an average of the confrontation parameters of the reference game pictures in a video clip may be calculated, and that average used as the fused confrontation parameter of the video clip.
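The averaging-based fusion above can be sketched as follows, assuming (per the text's example) 7 reference frames per one-second window; the function name is illustrative.

```python
def fused_scores(scores, per_second=7):
    """Average the per-reference-frame confrontation parameters within each
    one-second window (7 reference frames per second in the 35 fps /
    5-frame-interval example from the text)."""
    return [sum(scores[i:i + per_second]) / len(scores[i:i + per_second])
            for i in range(0, len(scores), per_second)]

# One quiet second containing a single spike, then one intense second:
fused = fused_scores([0, 0, 0, 0, 0, 0, 7,
                      7, 7, 7, 7, 7, 7, 7])
```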
Optionally, the step "determining a target game video clip from a plurality of video clips based on the post-fusion confrontation parameter" may include:
and determining a video clip whose fused confrontation parameter is greater than a preset value as a target game video clip.
The preset value may be set according to actual conditions, which this embodiment does not limit. The video segments whose fused confrontation parameter is greater than the preset value can be taken as target game video segments, i.e., the highlight segments of the original game video; all highlight segments are then spliced together to obtain the final game highlight video.
Optionally, in some embodiments, a little extra time may be kept before and after each highlight segment when editing, so that the resulting video looks smoother and more natural.
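The thresholding-plus-padding selection above can be sketched as follows. The threshold and the one-second padding are illustrative assumptions, and the function name is invented here.

```python
def highlight_seconds(fused, threshold=1.5, pad=1):
    """Return the indices (seconds) whose fused confrontation parameter
    exceeds the threshold, padded by `pad` seconds on each side so the
    cut clips look smoother and more natural."""
    keep = set()
    for i, s in enumerate(fused):
        if s > threshold:
            for j in range(max(0, i - pad), min(len(fused), i + pad + 1)):
                keep.add(j)
    return sorted(keep)

# Two spikes (seconds 1 and 5) each pull in their neighbouring seconds:
secs = highlight_seconds([0, 2, 0, 0, 0, 3, 0], threshold=1.5, pad=1)
```

The selected second indices would then be mapped back to frame ranges and spliced in order to form the final highlight video.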
This embodiment can extract the highlight segments of a game video and automatically clip them into a game highlight video. Specifically, the blood bar positions of all game characters in the game picture are detected by image template matching, which determines the characters' positions in the picture. The camp of each game character is distinguished by its blood bar color. Every n frames of the game video, the distances between the blood bar starting positions of game characters from different camps, together with the number of game characters, are computed, and a battle index (confrontation parameter) is calculated based on the observation that during a battle the two sides are close together and the characters are numerous. Moments whose battle index exceeds a certain threshold are extracted as highlight moments, and finally all highlight segments are edited into the final game video. Because it is based on the characteristics of game battle events and on image matching of the game characters' blood bars, this approach extracts highlights with high accuracy and speed, can capture highlight shots such as intense battles even without kills, can automatically generate a clipped video containing only the highlight shots from a long game video, and eliminates a large amount of manual clipping work.
As can be seen from the above, this embodiment may acquire an original game video and extract multiple frames of reference game pictures from it; identify, for each frame of reference game picture, the position information of the game characters in the game scene and the game confrontation party information to which each character belongs; calculate, for each frame of reference game picture, the confrontation parameter based on the number of game characters in the game scene and the distances between them during the confrontation, the confrontation parameter representing the confrontation intensity of the game confrontation parties in that frame; and cut a target game video segment out of the original game video based on the confrontation parameters of the reference game pictures. By exploiting the characteristics of game confrontation — calculating the confrontation intensity from the distances between, and the number of, game characters of different confrontation parties — the highlight game segments can be determined, greatly improving the accuracy and speed of clipping them.
The method described in the previous embodiment will be described in further detail below with the game video clipping device specifically integrated in the server.
The embodiment of the application provides a game video clipping method, and as shown in fig. 2, a specific flow of the game video clipping method may be as follows:
201. The server receives the original game video sent by the terminal.
In this embodiment, the original game video is the game video that needs to be cut, that is, the video to be cut; cutting the original game video yields a target game video segment, which may be a highlight segment of the original game video. The game type of the original game video is not limited, and the game scene may be a battle-type game scene, for example, a Massively Multiplayer Online Role-Playing Game (MMORPG) scene, a Sports Game (SPG) scene, a Multiplayer Online Battle Arena (MOBA) game scene, and the like, which is not limited in this embodiment.
202. The server extracts a plurality of reference game pictures from the original game video.
In this embodiment, the original game video includes a plurality of video frames, the video frames are game pictures, a part of the video frames may be extracted from the original game video as reference game pictures according to a preset time interval, and the preset time interval may be specifically set according to an actual situation. For example, the original game video has 35 frames of video frames per second, and the decimation may be performed at 5 frame intervals, i.e., 7 frames of video frames per second as the reference game picture.
203. The server identifies, for each frame of the reference game picture, position information of a game character in the game scene in the reference game picture, and game party information to which the game character belongs.
A game confrontation party refers to a game camp (faction): in typical games, game characters are divided into different camps, such as a red camp and a blue camp, and the number of camps is usually two or more. Confrontation between different camps generally produces the highlight segments of the game.
Optionally, in some embodiments, the step "the server identifies, for each frame of the reference game picture, position information of the game character in the game scene in the reference game picture, and game counterparty information to which the game character belongs", may include:
aiming at each frame of reference game picture, acquiring a preset life value indication template of a game role in a game scene;
identifying position information of a game role in a game scene in a reference game picture based on the sliding preset life value indication template;
and identifying the game counter party to which the game role in the reference game picture belongs to obtain the information of the game counter party to which the game role in the reference game picture belongs.
In some embodiments, the step of "identifying the position information of the game character in the game scene in the reference game screen based on the sliding preset life value indication template" may include:
identifying the position of a life value indication template of a game character in a reference game picture based on the sliding of a preset life value indication template on the reference game picture;
and acquiring the position information of the game role in the game scene in the reference game picture based on the position of the life value indication template.
The preset life value indication template may be a blood bar template. A life value indication template, i.e., a blood bar, is displayed above each game character and indicates that character's life value or battle value. Because the outline of the blood bar is the same for every game character, the positions of all blood bars can be found in the game picture by template matching, and the position of each game character in the reference game picture can then be located from the position of its blood bar.
Template matching is a technique for finding, within one image, the region that best matches (is most similar to) another template image. By sliding the blood bar template over the reference game picture, the region most similar to the template can be found; that region is the position of a game character's blood bar in the reference game picture, and the area just below it is the position of the game character in the game scene.
The preset life value indication template, i.e., the blood bar template, can be set according to actual conditions, which this embodiment does not limit. For example, since the blood amount, color, skill cooling time, and similar contents of a blood bar vary, the unchanging part of the blood bar — the part common to the blood bars of all game characters in different situations — can be used as the template, so that the blood bars of game characters can be recognized in as many situations as possible. As shown in fig. 1c, only the leading portion of the blood bar — the frame outline and the level number box in front of it — is taken as the blood bar template; this ensures that only the shape features of the blood bar are matched in the reference game picture, while accommodating the fact that the blood amount differs from moment to moment. In addition, the template can match blood bars in various special scenes: for example, as shown in fig. 1d, even when the edge area of a blood bar is cut off by the screen boundary, its position can still be detected, because the template matches only part of the blood bar; other situations, such as several overlapping blood bars, can likewise be recognized and are not listed here.
Optionally, in some embodiments, the step "identifying the position of the life value indication template of the game character in the reference game screen based on the sliding of the preset life value indication template on the reference game screen" may specifically include:
based on the sliding of the preset life value indication template over the reference game picture, calculating the grayscale similarity of the overlapping area between the template and the reference game picture at each sliding position;
and determining the position of the life value indication template of the game character in the reference game picture based on the gray similarity.
Optionally, in some embodiments, the step "determining the position of the life value indication template of the game character in the reference game screen based on the grayscale similarity" may include:
generating a matching graph corresponding to the reference game picture based on the gray level similarity, wherein the matching graph comprises a plurality of pixel points, and the values of the pixel points represent the gray level similarity of a preset life value indication template in the overlapping area of the pixel points and the reference game picture;
determining pixel points with the gray similarity higher than a preset similarity in the matching image as matching pixel points;
and determining the position of the life value indication template of the game role in the reference game picture based on the positions of the matched pixel points in the matching graph corresponding to the reference game picture.
The matching map is specifically a map of the grayscale similarity between the preset life value indication template and each overlapping area of the reference game picture. The higher the grayscale similarity, the better the template matches at that pixel point, that is, the higher the probability that the pixel point corresponds to the position of the blood bar.
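The sliding-similarity-threshold procedure described above can be sketched in plain Python. This is an illustrative sketch only: the function names, the use of mean absolute grayscale difference as the similarity measure, and the 0.95 threshold are assumptions not fixed by the patent (production code would typically use a library routine such as OpenCV's template matching).

```python
# Hypothetical sketch of sliding a template over a picture and building a
# matching map of grayscale similarities; all names are illustrative.

def match_map(image, template):
    """Slide `template` over grayscale `image` (lists of lists, values 0-255)
    and return the grayscale similarity at each overlap position."""
    ih, iw = len(image), len(image[0])
    th, tw = len(template), len(template[0])
    result = []
    for y in range(ih - th + 1):
        row = []
        for x in range(iw - tw + 1):
            # mean absolute grayscale difference over the overlap region
            diff = sum(
                abs(image[y + dy][x + dx] - template[dy][dx])
                for dy in range(th) for dx in range(tw)
            ) / (th * tw)
            row.append(1.0 - diff / 255.0)  # 1.0 means an identical overlap
        result.append(row)
    return result

def matching_points(sim_map, threshold=0.95):
    """Pixels whose similarity exceeds the preset threshold are matches."""
    return [(y, x)
            for y, row in enumerate(sim_map)
            for x, s in enumerate(row) if s > threshold]

# Toy 4x4 "picture" containing a 1x2 bright "blood bar" at row 1, column 1.
picture = [
    [0, 0,   0,   0],
    [0, 255, 255, 0],
    [0, 0,   0,   0],
    [0, 0,   0,   0],
]
template = [[255, 255]]
sim = match_map(picture, template)
print(matching_points(sim))  # the exact match at (1, 1)
```

The position of the best-matching pixel point then gives the blood bar position, from which the character's position below it follows.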
Optionally, in some embodiments, the step of "identifying the game counterparty to which the game character belongs in the reference game screen, and obtaining the game counterparty information to which the game character belongs in the reference game screen" may specifically include:
acquiring color information of a life value indication template of a game character in the reference game picture;
and determining game counter party information to which the game character belongs in the reference game picture based on the color information of the life value indication template.
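The color-based counterparty identification above can be sketched as follows. The green/blue-versus-red convention is an assumption (many games color allied health bars green or blue and enemy bars red); the patent only states that the counterparty is determined from the color information of the life value indication template.

```python
# Illustrative sketch: classify a character's counterparty from the dominant
# color of its blood bar region. The label mapping is an assumed convention.

def dominant_channel(pixels):
    """Return which RGB channel dominates the blood-bar region on average."""
    n = len(pixels)
    r = sum(p[0] for p in pixels) / n
    g = sum(p[1] for p in pixels) / n
    b = sum(p[2] for p in pixels) / n
    return max((("red", r), ("green", g), ("blue", b)), key=lambda t: t[1])[0]

def side_of(pixels):
    """Map the bar's dominant color to a game counterparty label."""
    return {"green": "ally", "blue": "ally", "red": "enemy"}[dominant_channel(pixels)]

bar = [(20, 200, 30), (25, 190, 40)]  # mostly green pixels
print(side_of(bar))  # "ally"
```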
204. For each frame of reference game picture, the server calculates a confrontation parameter based on the number of game characters in the game scene and the distances between game characters of opposing game counterparties, obtaining a confrontation parameter for each frame of reference game picture; the confrontation parameter represents the confrontation intensity of the game counterparties in that frame.
Alternatively, in some embodiments, the step of "calculating a confrontation parameter of a reference game picture for each frame based on the number of game characters in the game scene and the distance between the game characters when the game confrontation party confronts" may include:
aiming at each frame of reference game picture, obtaining the distance between the game roles when different game counterparties confront;
calculating local confrontation parameters of game characters in the game confrontation parties based on the distance;
and obtaining the confrontation parameters of the reference game picture by fusing the local confrontation parameters of each game role in each game confrontation party.
The distance between game characters can be replaced by the distance between their corresponding blood bars. Specifically, the distance between game characters of different game counterparties may be taken as the distance between the starting points of the characters' blood bars.
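The per-character weighting and fusion described above can be sketched as follows. The inverse-distance weight is an assumed concrete choice: the patent only states that the distances from a character to each character of the other counterparties are weighted to obtain a local confrontation parameter, and that the local parameters are then fused into the frame's confrontation parameter.

```python
# Hedged sketch of the confrontation-parameter computation. The inverse-
# distance weighting and sum-fusion are assumptions, not the patent's formula.
import math

def local_confrontation(char_pos, enemy_positions):
    """Weight the distance from one character to each opposing character:
    closer enemies contribute more, so the value rises as a fight tightens."""
    total = 0.0
    for ex, ey in enemy_positions:
        d = math.hypot(char_pos[0] - ex, char_pos[1] - ey)
        total += 1.0 / (1.0 + d)  # inverse-distance weight (assumption)
    return total

def frame_confrontation(team_a, team_b):
    """Fuse the local parameters of every character on both counterparties."""
    return (sum(local_confrontation(p, team_b) for p in team_a)
            + sum(local_confrontation(p, team_a) for p in team_b))

# Two 2-character teams: a close brawl scores higher than a distant stand-off.
close_fight = frame_confrontation([(0, 0), (1, 0)], [(1, 1), (2, 0)])
far_apart = frame_confrontation([(0, 0), (1, 0)], [(50, 50), (60, 60)])
print(close_fight > far_apart)  # True
```

More characters in the scene and smaller distances between opposing characters both raise the frame's parameter, matching the intuition that dense fights are the intense moments.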
205. The server cuts out a target game video segment from the original game video based on the confrontation parameter of each frame of reference game picture.
Alternatively, in some embodiments, the step of "cutting out the target game video clip from the original game video based on the confrontation parameter of each frame of the reference game picture" may include:
dividing the original game video into a plurality of video segments according to a preset time interval;
fusing the confrontation parameters of the reference game pictures in each video clip to obtain the confrontation parameters after the fusion of the video clips;
and determining a target game video clip from a plurality of video clips based on the fused confrontation parameter.
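The segment-and-select steps above can be sketched as follows. Averaging as the fusion rule and "highest score wins" as the selection rule are assumed concrete choices; the patent only specifies dividing by a preset time interval, fusing per-frame parameters per segment, and determining the target segment from the fused values.

```python
# Minimal sketch of step 205: split the video's per-frame confrontation
# parameters into fixed-length segments, fuse each segment by averaging
# (an assumed fusion rule), and keep the highest-scoring segment.

def clip_target_segment(frame_params, frames_per_segment):
    """frame_params: confrontation parameter of each reference frame, in order.
    Returns (start_frame, end_frame) of the highest-scoring segment."""
    segments = [
        frame_params[i:i + frames_per_segment]
        for i in range(0, len(frame_params), frames_per_segment)
    ]
    scores = [sum(s) / len(s) for s in segments]  # fused parameter per segment
    best = max(range(len(scores)), key=scores.__getitem__)
    start = best * frames_per_segment
    return start, min(start + frames_per_segment, len(frame_params))

# 9 frames, preset interval of 3 frames: the middle segment holds the fight.
params = [0.1, 0.2, 0.1, 2.5, 3.0, 2.8, 0.3, 0.1, 0.2]
print(clip_target_segment(params, 3))  # (3, 6)
```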
206. And the server sends the target game video clip to the terminal.
As can be seen from the above, in this embodiment the server may receive the original game video sent by the terminal; extract multiple frames of reference game pictures from the original game video; for each frame of reference game picture, identify the position information of the game characters in the game scene and the game counterparty to which each character belongs; for each frame, calculate a confrontation parameter based on the number of game characters in the scene and the distances between characters of opposing counterparties, the parameter representing the confrontation intensity of the game counterparties in that frame; cut a target game video segment out of the original game video based on the per-frame confrontation parameters; and send the target game video segment to the terminal. By exploiting the characteristics of game confrontation, namely computing confrontation intensity from the number of characters and the distances between characters of different counterparties, highlight segments can be identified, which greatly improves both the accuracy and the speed of clipping game highlights.
In order to better implement the above method, an embodiment of the present application further provides a game video clipping device, as shown in fig. 3a, which may include an obtaining unit 301, an identifying unit 302, a calculating unit 303, and a clipping unit 304, as follows:
(1) an acquisition unit 301;
an acquisition unit 301 configured to acquire an original game video and extract a plurality of reference game pictures from the original game video.
(2) An identification unit 302;
the identifying unit 302 is configured to identify, for each frame of the reference game picture, position information of a game character in the game scene in the reference game picture, and game party information to which the game character belongs.
Optionally, in some embodiments of the present application, the identifying unit 302 may include a first acquiring subunit 3021, a first identifying subunit 3022, and a second identifying subunit 3023, see fig. 3b, as follows:
the first obtaining subunit 3021, configured to obtain, for each frame of reference game picture, a preset life value indication template of a game character in a game scene;
a first identifying subunit 3022, configured to identify, based on the sliding preset life value indication template, position information of a game character in the game scene in the reference game picture;
a second identifying subunit 3023, configured to identify a game party to which the game character belongs in the reference game screen, and obtain information of the game party to which the game character belongs in the reference game screen.
Optionally, in some embodiments of the present application, the second identifying subunit 3023 may be specifically configured to obtain color information of a life value indication template of a game character in the reference game screen; and determining game counter party information to which the game character belongs in the reference game picture based on the color information of the life value indication template.
Optionally, in some embodiments of the present application, the first identifying subunit 3022 may be specifically configured to identify, based on a sliding of a preset life value indication template on a reference game screen, a position of a life value indication template of a game character in the reference game screen; and acquiring the position information of the game role in the game scene in the reference game picture based on the position of the life value indication template.
Optionally, in some embodiments, the step "identifying the position of the life value indication template of the game character in the reference game screen based on the sliding of the preset life value indication template on the reference game screen" may specifically include:
based on the sliding of the preset life value indication template over the reference game picture, calculating the grayscale similarity of the overlapping area between the template and the reference game picture at each sliding position;
and determining the position of the life value indication template of the game character in the reference game picture based on the gray similarity.
Optionally, in some embodiments, the step "determining the position of the life value indication template of the game character in the reference game screen based on the grayscale similarity" may include:
generating a matching graph corresponding to the reference game picture based on the gray level similarity, wherein the matching graph comprises a plurality of pixel points, and the values of the pixel points represent the gray level similarity of a preset life value indication template in the overlapping area of the pixel points and the reference game picture;
determining pixel points with the gray similarity higher than a preset similarity in the matching image as matching pixel points;
and determining the position of the life value indication template of the game role in the reference game picture based on the positions of the matched pixel points in the matching graph corresponding to the reference game picture.
(3) A calculation unit 303;
a calculating unit 303, configured to calculate, for each frame of reference game picture, a confrontation parameter of the reference game picture based on the number of game characters in the game scene and the distances between game characters of opposing game counterparties, obtaining a confrontation parameter for each frame of reference game picture, where the confrontation parameter represents the confrontation intensity of the game counterparties in each frame of reference game picture.
Optionally, in some embodiments of the present application, the calculating unit 303 may include a second obtaining sub-unit 3031, a calculating sub-unit 3032, and a first fusing sub-unit 3033, see fig. 3c, as follows:
the second obtaining subunit 3031 is configured to obtain, for each frame of reference game picture, a distance between game characters when different game participants play the game;
a calculation subunit 3032, configured to calculate local confrontation parameters of the game character in each game confrontation party based on the distance;
a first fusion sub-unit 3033, configured to obtain the confrontation parameter of the reference game picture by fusing the local confrontation parameter of each game character in each game confrontation.
Optionally, in some embodiments, the calculating sub-unit 3032 may be specifically configured to, for each game role in each game counterparty, perform weighting processing on a distance from the game role in the game counterparty to each game role in other game counterparty, obtain a local confrontation parameter of the game role in the game counterparty, and obtain a local confrontation parameter of the game role in each game counterparty.
(4) A cutting unit 304;
a cutting unit 304, configured to cut out a target game video segment from the original game video based on the confrontation parameter of each frame of reference game picture.
Optionally, in some embodiments of the present application, the clipping unit 304 may include a dividing subunit 3041, a second fusing subunit 3042, and a determining subunit 3043, see fig. 3d, as follows:
the dividing unit 3041 is configured to divide the original game video into a plurality of video segments according to a preset time interval;
a second fusion subunit 3042, configured to fuse the confrontation parameters of the reference game pictures in each video clip to obtain post-fusion confrontation parameters of the video clips;
a determining subunit 3043, configured to determine a target game video segment from the plurality of video segments based on the post-fusion confrontation parameter.
As can be seen from the above, in this embodiment the obtaining unit 301 may obtain an original game video and extract multiple frames of reference game pictures from it; for each frame of reference game picture, the identifying unit 302 identifies the position information of the game characters in the game scene and the game counterparty to which each character belongs; for each frame, the calculating unit 303 calculates a confrontation parameter based on the number of game characters in the scene and the distances between characters of opposing counterparties, the parameter representing the confrontation intensity of the game counterparties in that frame; and the cutting unit 304 cuts a target game video segment out of the original game video based on the per-frame confrontation parameters. By computing confrontation intensity from the number of characters and the distances between characters of different counterparties, highlight segments can be identified, greatly improving both the accuracy and the speed of clipping game highlights.
An electronic device according to an embodiment of the present application is further provided, as shown in fig. 4, which shows a schematic structural diagram of the electronic device according to an embodiment of the present application, specifically:
the electronic device may include components such as a processor 401 of one or more processing cores, memory 402 of one or more computer-readable storage media, a power supply 403, and an input unit 404. Those skilled in the art will appreciate that the electronic device configuration shown in fig. 4 does not constitute a limitation of the electronic device and may include more or fewer components than those shown, or some components may be combined, or a different arrangement of components. Wherein:
the processor 401 is a control center of the electronic device, connects various parts of the whole electronic device by various interfaces and lines, performs various functions of the electronic device and processes data by running or executing software programs and/or modules stored in the memory 402 and calling data stored in the memory 402, thereby performing overall monitoring of the electronic device. Optionally, processor 401 may include one or more processing cores; preferably, the processor 401 may integrate an application processor, which mainly handles operating systems, user interfaces, application programs, etc., and a modem processor, which mainly handles wireless communications. It will be appreciated that the modem processor described above may not be integrated into the processor 401.
The memory 402 may be used to store software programs and modules, and the processor 401 executes various functional applications and data processing by running the software programs and modules stored in the memory 402. The memory 402 may mainly include a program storage area and a data storage area, wherein the program storage area may store an operating system, an application program required by at least one function (such as a sound playing function, an image playing function, etc.), and the like; the data storage area may store data created according to the use of the electronic device, and the like. Further, the memory 402 may include high-speed random access memory, and may also include non-volatile memory, such as at least one magnetic disk storage device, flash memory device, or other non-volatile solid-state storage device. Accordingly, the memory 402 may also include a memory controller to provide the processor 401 access to the memory 402.
The electronic device further comprises a power supply 403 for supplying power to the various components, and preferably, the power supply 403 is logically connected to the processor 401 through a power management system, so that functions of managing charging, discharging, and power consumption are realized through the power management system. The power supply 403 may also include any component of one or more dc or ac power sources, recharging systems, power failure detection circuitry, power converters or inverters, power status indicators, and the like.
The electronic device may further include an input unit 404, and the input unit 404 may be used to receive input numeric or character information and generate keyboard, mouse, joystick, optical or trackball signal inputs related to user settings and function control.
Although not shown, the electronic device may further include a display unit and the like, which are not described in detail herein. Specifically, in this embodiment, the processor 401 in the electronic device loads the executable file corresponding to the process of one or more application programs into the memory 402 according to the following instructions, and the processor 401 runs the application program stored in the memory 402, thereby implementing various functions as follows:
acquiring an original game video, and extracting a plurality of frames of reference game pictures from the original game video; aiming at each frame of reference game picture, identifying the position information of the game role in the game scene in the reference game picture and the game opponent information of the game role; aiming at each frame of reference game picture, calculating a confrontation parameter of the reference game picture based on the number of game characters in the game scene and the distance between the game characters when the confrontation parties confront with each other, and obtaining the confrontation parameter of each frame of reference game picture, wherein the confrontation parameter represents the confrontation intensity of each game confrontation party in each frame of reference game picture; based on the confrontation parameters of each frame of reference game picture, a target game video segment is cut out from the original game video.
The above operations can be implemented in the foregoing embodiments, and are not described in detail herein.
As can be seen from the above, this embodiment may acquire an original game video and extract multiple frames of reference game pictures from it; for each frame of reference game picture, identify the position information of the game characters in the game scene and the game counterparty to which each character belongs; for each frame, calculate a confrontation parameter based on the number of game characters in the scene and the distances between characters of opposing counterparties, the parameter representing the confrontation intensity of the game counterparties in that frame; and cut a target game video segment out of the original game video based on the per-frame confrontation parameters. By computing confrontation intensity from the number of characters and the distances between characters of different counterparties, highlight segments can be identified, greatly improving both the accuracy and the speed of clipping game highlights.
It will be understood by those skilled in the art that all or part of the steps of the methods of the above embodiments may be performed by instructions or by associated hardware controlled by the instructions, which may be stored in a computer readable storage medium and loaded and executed by a processor.
To this end, embodiments of the present application provide a storage medium having stored therein a plurality of instructions, which can be loaded by a processor to perform the steps of any one of the game video clipping methods provided by embodiments of the present application. For example, the instructions may perform the steps of:
acquiring an original game video, and extracting a plurality of frames of reference game pictures from the original game video; aiming at each frame of reference game picture, identifying the position information of the game role in the game scene in the reference game picture and the game opponent information of the game role; aiming at each frame of reference game picture, calculating a confrontation parameter of the reference game picture based on the number of game characters in the game scene and the distance between the game characters when the confrontation parties confront with each other, and obtaining the confrontation parameter of each frame of reference game picture, wherein the confrontation parameter represents the confrontation intensity of each game confrontation party in each frame of reference game picture; based on the confrontation parameters of each frame of reference game picture, a target game video segment is cut out from the original game video.
The above operations can be implemented in the foregoing embodiments, and are not described in detail herein.
Wherein the storage medium may include: read Only Memory (ROM), Random Access Memory (RAM), magnetic or optical disks, and the like.
Since the instructions stored in the storage medium can execute the steps in any game video editing method provided in the embodiments of the present application, the beneficial effects that can be achieved by any game video editing method provided in the embodiments of the present application can be achieved, which are detailed in the foregoing embodiments and will not be described herein again.
The foregoing describes in detail a game video editing method, apparatus, electronic device and storage medium provided in the embodiments of the present application, and specific examples are applied herein to explain the principles and implementations of the present application, and the description of the foregoing embodiments is only used to help understand the method and core ideas of the present application; meanwhile, for those skilled in the art, according to the idea of the present application, there may be variations in the specific embodiments and the application scope, and in summary, the content of the present specification should not be construed as a limitation to the present application.
Claims (10)
1. A game video clipping method, comprising:
acquiring an original game video, and extracting a plurality of frames of reference game pictures from the original game video;
aiming at each frame of reference game picture, identifying the position information of the game role in the game scene in the reference game picture and the game opponent information of the game role;
aiming at each frame of reference game picture, calculating a confrontation parameter of the reference game picture based on the number of game characters in the game scene and the distance between the game characters when the confrontation parties confront with each other, and obtaining the confrontation parameter of each frame of reference game picture, wherein the confrontation parameter represents the confrontation intensity of each game confrontation party in each frame of reference game picture;
based on the confrontation parameters of each frame of reference game picture, a target game video segment is cut out from the original game video.
2. The method of claim 1, wherein for each frame of reference game picture, identifying position information of a game character in the reference game picture in a game scene and game party information to which the game character belongs comprises:
aiming at each frame of reference game picture, acquiring a preset life value indication template of a game role in a game scene;
identifying position information of a game role in a game scene in a reference game picture based on the sliding preset life value indication template;
and identifying the game counter party to which the game role in the reference game picture belongs to obtain the information of the game counter party to which the game role in the reference game picture belongs.
3. The method of claim 2, wherein the identifying the game party to which the game character belongs in the reference game screen to obtain the game party information to which the game character belongs in the reference game screen comprises:
acquiring color information of a life value indication template of a game character in the reference game picture;
and determining game counter party information to which the game character belongs in the reference game picture based on the color information of the life value indication template.
4. The method of claim 2, wherein the identifying the position information of the game character in the game scene in the reference game picture based on the sliding preset life value indication template comprises:
identifying the position of a life value indication template of a game character in a reference game picture based on the sliding of a preset life value indication template on the reference game picture;
and acquiring the position information of the game role in the game scene in the reference game picture based on the position of the life value indication template.
5. The method of claim 4, wherein identifying the position of the life value indicating template of the game character in the reference game picture based on the sliding of the preset life value indicating template on the reference game picture comprises:
based on the sliding of a preset life value indication template on a reference game picture, calculating the gray level similarity of an overlapped area of the preset life value indication template and the reference game picture after the sliding;
and determining the position of the life value indication template of the game character in the reference game picture based on the gray similarity.
6. The method of claim 5, wherein determining the position of the life value indication template of the game character in the reference game picture based on the gray scale similarity comprises:
generating a matching graph corresponding to the reference game picture based on the gray level similarity, wherein the matching graph comprises a plurality of pixel points, and the values of the pixel points represent the gray level similarity of a preset life value indication template in the overlapping area of the pixel points and the reference game picture;
determining pixel points with the gray similarity higher than a preset similarity in the matching image as matching pixel points;
and determining the position of the life value indication template of the game role in the reference game picture based on the positions of the matched pixel points in the matching graph corresponding to the reference game picture.
7. The method of claim 1, wherein calculating the confrontation parameter of the reference game picture based on the number of game characters in the game scene and the distance between the game characters when the game confrontation party confronts each other for each frame of the reference game picture comprises:
aiming at each frame of reference game picture, obtaining the distance between the game roles when different game counterparties confront;
calculating local confrontation parameters of game characters in the game confrontation parties based on the distance;
and obtaining the confrontation parameters of the reference game picture by fusing the local confrontation parameters of each game role in each game confrontation party.
8. The method of claim 7, wherein calculating the local confrontational parameters for the game character in each game confrontation based on the distance comprises:
and for each game role in each game opponent, carrying out weighting processing on the distance from the game role in the game opponent to each game role in other game opponents to obtain the local confrontation parameters of the game role in the game opponent and obtain the local confrontation parameters of the game role in each game opponent.
9. The method of claim 1, wherein the cutting out a target game video segment from the original game video based on the confrontational parameter of each frame of reference game picture comprises:
dividing the original game video into a plurality of video segments according to a preset time interval;
fusing the confrontation parameters of the reference game pictures in each video clip to obtain the confrontation parameters after the fusion of the video clips;
and determining a target game video clip from a plurality of video clips based on the fused confrontation parameter.
10. A game video clip apparatus, comprising:
the device comprises an acquisition unit, a processing unit and a display unit, wherein the acquisition unit is used for acquiring an original game video and extracting a plurality of frames of reference game pictures from the original game video;
the identification unit is used for identifying the position information of the game role in the game scene in the reference game picture and the game counter party information of the game role in each frame of reference game picture;
the computing unit is used for computing a confrontation parameter of each frame of reference game picture according to the number of game characters in the game scene and the distance between the game characters when the confrontation parties confront with each other, and the confrontation parameter represents the confrontation intensity of each game confrontation party in each frame of reference game picture;
and the cutting unit is used for cutting out a target game video clip from the original game video based on the confrontation parameter of each frame of reference game picture.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202010302098.1A CN111491179B (en) | 2020-04-16 | 2020-04-16 | Game video editing method and device |
Publications (2)
Publication Number | Publication Date |
---|---|
CN111491179A (en) | 2020-08-04 |
CN111491179B (en) | 2023-07-14 |
Family
ID=71812832
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202010302098.1A Active CN111491179B (en) | 2020-04-16 | 2020-04-16 | Game video editing method and device |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN111491179B (en) |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20110195779A1 (en) * | 2010-02-05 | 2011-08-11 | Pc Concepts Limited | Methods and apparatuses for constructing interactive video games by use of video clip |
CN108234825A (en) * | 2018-01-12 | 2018-06-29 | 广州市百果园信息技术有限公司 | Method for processing video frequency and computer storage media, terminal |
CN108259990A (en) * | 2018-01-26 | 2018-07-06 | 腾讯科技(深圳)有限公司 | A kind of method and device of video clipping |
CN109672922A (en) * | 2017-10-17 | 2019-04-23 | 腾讯科技(深圳)有限公司 | A kind of game video clipping method and device |
CN110019951A (en) * | 2017-09-29 | 2019-07-16 | 华为软件技术有限公司 | A kind of method and apparatus generating video thumbnails |
Cited By (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN112188268A (en) * | 2020-09-25 | 2021-01-05 | 腾讯科技(深圳)有限公司 | Virtual scene display method, virtual scene introduction video generation method and device |
CN117499701A (en) * | 2023-12-29 | 2024-02-02 | 景色智慧(北京)信息科技有限公司 | Method and device for realizing riding game lens close-up and electronic equipment |
CN117499701B (en) * | 2023-12-29 | 2024-03-12 | 景色智慧(北京)信息科技有限公司 | Method and device for realizing riding game lens close-up and electronic equipment |
Also Published As
Publication number | Publication date |
---|---|
CN111491179B (en) | 2023-07-14 |
Similar Documents
Publication | Title | Publication Date |
---|---|---|
CN111491173B (en) | Live cover determination method and device, computer equipment and storage medium | |
CN109522815B (en) | Concentration degree evaluation method and device and electronic equipment | |
CN111013150B (en) | Game video editing method, device, equipment and storage medium | |
US20090202114A1 (en) | Live-Action Image Capture | |
CN112232258B (en) | Information processing method, device and computer readable storage medium | |
CN112272295B (en) | Method for generating video with three-dimensional effect, method for playing video, device and equipment | |
Szwoch | FEEDB: a multimodal database of facial expressions and emotions | |
CN111035933B (en) | Abnormal game detection method and device, electronic equipment and readable storage medium | |
CN114374882B (en) | Barrage information processing method, barrage information processing device, barrage information processing terminal and computer readable storage medium | |
CN113497946B (en) | Video processing method, device, electronic equipment and storage medium | |
CN111643900A (en) | Display picture control method and device, electronic equipment and storage medium | |
CN113453034A (en) | Data display method and device, electronic equipment and computer readable storage medium | |
CN111491179A (en) | Game video editing method and device | |
CN112150349A (en) | Image processing method and device, computer equipment and storage medium | |
CN109670385A (en) | The method and device that expression updates in a kind of application program | |
CN111768478A (en) | Image synthesis method and device, storage medium and electronic equipment | |
CN112287848A (en) | Live broadcast-based image processing method and device, electronic equipment and storage medium | |
CN111191542B (en) | Method, device, medium and electronic equipment for identifying abnormal actions in virtual scene | |
CN113392690A (en) | Video semantic annotation method, device, equipment and storage medium | |
CN114095742A (en) | Video recommendation method and device, computer equipment and storage medium | |
CN117032520A (en) | Video playing method and device based on digital person, electronic equipment and storage medium | |
CN111783587A (en) | Interaction method, device and storage medium | |
CN112131426B (en) | Game teaching video recommendation method and device, electronic equipment and storage medium | |
CN115272057A (en) | Training of cartoon sketch image reconstruction network and reconstruction method and equipment thereof | |
CN111008577A (en) | Virtual face-based scoring method, system, device and storage medium |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||