CN111491179B - Game video editing method and device - Google Patents

Game video editing method and device

Info

Publication number
CN111491179B
CN111491179B (application CN202010302098.1A)
Authority
CN
China
Prior art keywords
game
countermeasure
video
picture
frame
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202010302098.1A
Other languages
Chinese (zh)
Other versions
CN111491179A (en)
Inventor
余自强
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Tencent Technology Shenzhen Co Ltd
Original Assignee
Tencent Technology Shenzhen Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Tencent Technology Shenzhen Co Ltd filed Critical Tencent Technology Shenzhen Co Ltd
Priority to CN202010302098.1A
Publication of CN111491179A
Application granted
Publication of CN111491179B
Legal status: Active

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/20Servers specifically adapted for the distribution of content, e.g. VOD servers; Operations thereof
    • H04N21/23Processing of content or additional data; Elementary server operations; Server middleware
    • H04N21/234Processing of video elementary streams, e.g. splicing of video streams, manipulating MPEG-4 scene graphs
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63FCARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/60Generating or modifying game content before or while executing the game program, e.g. authoring tools specially adapted for game development or game-integrated level editor
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/20Servers specifically adapted for the distribution of content, e.g. VOD servers; Operations thereof
    • H04N21/23Processing of content or additional data; Elementary server operations; Server middleware
    • H04N21/234Processing of video elementary streams, e.g. splicing of video streams, manipulating MPEG-4 scene graphs
    • H04N21/23418Processing of video elementary streams, e.g. splicing of video streams, manipulating MPEG-4 scene graphs involving operations for analysing video streams, e.g. detecting features or characteristics
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/44Processing of video elementary streams, e.g. splicing a video clip retrieved from local storage with an incoming video stream, rendering scenes according to MPEG-4 scene graphs
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/44Processing of video elementary streams, e.g. splicing a video clip retrieved from local storage with an incoming video stream, rendering scenes according to MPEG-4 scene graphs
    • H04N21/44008Processing of video elementary streams, e.g. splicing a video clip retrieved from local storage with an incoming video stream, rendering scenes according to MPEG-4 scene graphs involving operations for analysing video streams, e.g. detecting features or characteristics in the video stream

Abstract

The application discloses a game video editing method and device. The method acquires an original game video and extracts multiple frames of reference game pictures from it; for each frame of reference game picture, it identifies the position of each game character in the game scene and the game faction to which the character belongs; it then calculates a countermeasure parameter for each frame based on the number of game characters in the scene and the distances between characters of opposing factions in combat; finally, a target game video clip is cut from the original game video based on the countermeasure parameter of each reference frame. By exploiting the characteristics of in-game confrontation, the method measures the intensity of combat between the factions from the number of engaged characters and the distances between characters of different factions, and thereby locates the highlight clips, greatly improving the accuracy and speed of highlight clipping.

Description

Game video editing method and device
Technical Field
The present application relates to the field of computer technologies, and in particular, to a method and apparatus for editing a game video.
Background
With the continuous improvement of living standards, people's entertainment demands are also rising, and many people like to play electronic games in their spare time. During play, users often want to share game videos they find interesting for others to watch. However, for some games a single match is too long, possibly more than half an hour, and is inconvenient to share as-is; the highlights in the game video therefore need to be clipped into a shorter video before sharing.
In the related art, highlight clips in game videos are generally cut manually, which requires a large editing workload. There are also approaches that clip highlights by detecting game events in the game video, where the game events may include kills, assists, on-screen highlight text, and the like; however, clipping based on game events alone is too monotonous and inaccurate. A large share of highlight moments comes from the process of the opposing factions fighting each other, during which such events are often not generated, so event-based clipping cannot capture these moments.
Disclosure of Invention
The embodiment of the application provides a game video editing method and device, which can greatly improve the accuracy and speed of highlight game clip editing.
An embodiment of the present application provides a game video editing method, which comprises the following steps:
acquiring an original game video, and extracting multiple frames of reference game pictures from the original game video;
identifying, for each frame of reference game picture, the position information of each game character in the game scene and the game faction to which the character belongs;
for each frame of reference game picture, calculating the countermeasure parameter of that picture based on the number of game characters in the game scene and the distances between game characters of opposing factions in combat, to obtain the countermeasure parameter of each frame of reference game picture, wherein the countermeasure parameter characterizes the intensity of the confrontation between the game factions in each frame;
cutting a target game video clip from the original game video based on the countermeasure parameter of each frame of reference game picture.
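For illustration only, the core idea of the steps above can be sketched in Python. The inverse-distance scoring below is an assumed, simplified form of the countermeasure parameter — the claims do not fix a particular formula — and the function name is hypothetical:

```python
import math

def countermeasure_parameter(factions):
    """Illustrative countermeasure parameter for one reference game picture.

    `factions` maps a faction name to a list of (x, y) character positions.
    The score grows with the number of characters engaged and with their
    proximity across factions (assumed inverse-distance weighting).
    """
    names = list(factions)
    score = 0.0
    for i, a in enumerate(names):
        for b in names[i + 1:]:
            for pa in factions[a]:
                for pb in factions[b]:
                    score += 1.0 / (1.0 + math.dist(pa, pb))
    return score

# Two close opposing characters score higher than two distant ones.
near = countermeasure_parameter({"red": [(0, 0)], "blue": [(1, 0)]})
far = countermeasure_parameter({"red": [(0, 0)], "blue": [(100, 0)]})
```

On this toy input, `near` exceeds `far`, matching the intuition that frames with characters in close combat are more likely highlights.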
Accordingly, an embodiment of the present application provides a game video editing device, comprising:
an acquisition unit, configured to acquire an original game video and extract multiple frames of reference game pictures from the original game video;
an identification unit, configured to identify, for each frame of reference game picture, the position information of each game character in the game scene and the game faction to which the character belongs;
a calculation unit, configured to calculate, for each frame of reference game picture, the countermeasure parameter of that picture based on the number of game characters in the game scene and the distances between game characters of opposing factions in combat, to obtain the countermeasure parameter of each frame of reference game picture, where the countermeasure parameter characterizes the intensity of the confrontation between the game factions in each frame;
and a cutting unit, configured to cut a target game video clip from the original game video based on the countermeasure parameter of each frame of reference game picture.
Optionally, in some embodiments of the present application, the identification unit may include a first acquisition subunit, a first identification subunit, and a second identification subunit, as follows:
the first acquisition subunit is configured to acquire, for each frame of reference game picture, a preset life value indication template (health bar template) of the game characters in the game scene;
the first identification subunit is configured to identify the position information of the game characters in the game scene by sliding the preset life value indication template over the reference game picture;
and the second identification subunit is configured to identify the game faction to which each game character in the reference game picture belongs, obtaining the faction information of each game character.
Optionally, in some embodiments of the present application, the second identification subunit may be specifically configured to acquire the color information of the life value indication template of each game character in the reference game picture, and to determine the game faction to which the character belongs based on that color information.
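A minimal sketch of the color test just described: the faction is inferred from the dominant color of the character's health bar. The red/blue convention and the simple channel comparison are illustrative assumptions, not details fixed by the patent:

```python
def faction_from_bar_color(mean_rgb):
    """Classify a health bar's faction from its mean (R, G, B) color.

    Assumes (for illustration) two factions rendered as red and blue bars;
    the dominant of the red and blue channels decides the faction.
    """
    r, g, b = mean_rgb
    return "red" if r > b else "blue"
```

In practice the mean color would be sampled from the pixel region where the template matched in the reference game picture.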
Optionally, in some embodiments of the present application, the first identification subunit may be specifically configured to identify the position of the life value indication template of each game character by sliding the preset template over the reference game picture, and to obtain the position information of the game character in the game scene based on the position of its life value indication template.
Optionally, in some embodiments, the step of identifying the position of the life value indication template of a game character in the reference game picture based on sliding the preset template over the picture may specifically include:
calculating the grayscale similarity between the preset life value indication template and the region of the reference game picture it overlaps at each sliding position;
and determining the position of the life value indication template of the game character in the reference game picture based on the grayscale similarity.
Optionally, in some embodiments, the step of determining the position of the life value indication template based on the grayscale similarity may include:
generating a matching map corresponding to the reference game picture based on the grayscale similarity, where the matching map comprises a plurality of pixels, and the value of each pixel represents the grayscale similarity between the preset template and the region of the reference game picture it overlaps when placed at that pixel;
determining the pixels in the matching map whose grayscale similarity is higher than a preset similarity as matched pixels;
and determining the position of the life value indication template of the game character in the reference game picture based on the positions of the matched pixels in the matching map.
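The matching-map construction above can be sketched in pure NumPy for illustration (in production a library routine such as OpenCV's `cv2.matchTemplate` would typically be used). The normalized absolute-difference similarity and the 0.95 threshold are illustrative assumptions, not values fixed by the patent:

```python
import numpy as np

def match_map(frame_gray: np.ndarray, template: np.ndarray) -> np.ndarray:
    """Slide the template over the grayscale frame; each output pixel holds
    the similarity (1.0 = identical) of the overlapped region."""
    th, tw = template.shape
    fh, fw = frame_gray.shape
    out = np.zeros((fh - th + 1, fw - tw + 1))
    t = template.astype(float)
    for y in range(out.shape[0]):
        for x in range(out.shape[1]):
            patch = frame_gray[y:y + th, x:x + tw].astype(float)
            # normalized mean absolute difference -> similarity in [0, 1]
            out[y, x] = 1.0 - np.mean(np.abs(patch - t)) / 255.0
    return out

def matched_positions(mmap: np.ndarray, threshold: float = 0.95):
    """Pixels of the matching map above the preset similarity threshold."""
    ys, xs = np.where(mmap > threshold)
    return list(zip(ys.tolist(), xs.tolist()))
```

Placing a synthetic bar-shaped template into a blank frame and running `matched_positions(match_map(frame, template))` returns exactly the coordinates where the bar was placed, which mirrors the matched-pixel determination described above.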
Optionally, in some embodiments of the present application, the calculation unit may include a second acquisition subunit, a calculation subunit, and a first fusion subunit, as follows:
the second acquisition subunit is configured to acquire, for each frame of reference game picture, the distances between game characters of different factions in combat;
the calculation subunit is configured to calculate, based on those distances, the local countermeasure parameter of each game character in each game faction;
and the first fusion subunit is configured to obtain the countermeasure parameter of the reference game picture by fusing the local countermeasure parameters of the game characters in each faction.
Optionally, in some embodiments, the calculation subunit may be specifically configured to, for each game character in each game faction, weight the distances between that character and each game character of the other factions to obtain the local countermeasure parameter of that character, thereby obtaining the local countermeasure parameter of every game character in each faction.
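The weighting step above can be sketched as follows; the inverse-distance weight is an illustrative choice (the patent leaves the exact weighting open), and the function name is hypothetical:

```python
import math

def local_parameter(pos, enemy_positions):
    """Local countermeasure parameter of one game character: a weighted sum
    over its distances to every character of the other factions. Closer
    enemies contribute more (assumed inverse-distance weighting)."""
    total = 0.0
    for enemy in enemy_positions:
        d = math.dist(pos, enemy)
        total += 1.0 / (1.0 + d)
    return total
```

The frame-level countermeasure parameter would then be obtained by fusing (e.g., summing) the local parameters of all characters, as the first fusion subunit describes.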
Optionally, in some embodiments of the present application, the cutting unit may include a dividing subunit, a second fusion subunit, and a determination subunit, as follows:
the dividing subunit is configured to divide the original game video into a plurality of video clips at a preset time interval;
the second fusion subunit is configured to fuse the countermeasure parameters of the reference game pictures within each video clip to obtain the fused countermeasure parameter of that clip;
and the determination subunit is configured to determine the target game video clip from the plurality of video clips based on the fused countermeasure parameters.
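A compact sketch of the divide-fuse-select steps above: split the per-frame parameters into fixed-length segments, fuse (here: average) within each, and keep the segment with the highest fused value. Averaging is one plausible fusion — the patent does not fix the fusion operation — and the function name is hypothetical:

```python
def best_segment(frame_params, seg_len):
    """Return (start, end) frame indices of the segment whose fused
    (averaged) countermeasure parameter is highest."""
    segments = [frame_params[i:i + seg_len]
                for i in range(0, len(frame_params), seg_len)]
    fused = [sum(s) / len(s) for s in segments]  # fuse within each segment
    best = max(range(len(fused)), key=fused.__getitem__)
    start = best * seg_len
    return start, min(start + seg_len, len(frame_params))
```

For example, with per-frame parameters `[0, 1, 9, 9, 1, 0]` and a segment length of 2, the middle segment (frames 2–4) is selected as the target clip.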
An electronic device provided in an embodiment of the present application includes a processor and a memory, where the memory stores a plurality of instructions, and the processor loads the instructions to execute the steps of the game video editing method provided in the embodiments of the present application.
In addition, an embodiment of the present application further provides a storage medium on which a computer program is stored, wherein the computer program, when executed by a processor, implements the steps of the game video editing method provided in the embodiments of the present application.
The embodiment of the application provides a game video editing method and device, which can acquire an original game video and extract multiple frames of reference game pictures from it; identify, for each frame of reference game picture, the position information of each game character in the game scene and the game faction to which the character belongs; calculate, for each frame, the countermeasure parameter of the picture based on the number of game characters in the scene and the distances between characters of opposing factions in combat, obtaining the countermeasure parameter of each frame, where the countermeasure parameter characterizes the intensity of the confrontation between the factions; and cut a target game video clip from the original game video based on the countermeasure parameter of each reference frame. By exploiting the characteristics of in-game confrontation, the method measures the intensity of combat between the factions from the distances between game characters of different factions and the number of engaged characters, thereby determining the highlight game clips and greatly improving the accuracy and speed of highlight clipping.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present application, the drawings needed in the description of the embodiments are briefly introduced below. It is obvious that the drawings in the following description show only some embodiments of the present application, and that a person skilled in the art can obtain other drawings from them without inventive effort.
FIG. 1a is a schematic scene diagram of a game video editing method provided by an embodiment of the present application;
FIG. 1b is a flow chart of a game video editing method provided by an embodiment of the present application;
FIG. 1c is a schematic template diagram of a game video editing method provided by an embodiment of the present application;
FIG. 1d is a schematic illustration of a game video editing method provided by an embodiment of the present application;
FIG. 1e is another schematic illustration of a game video editing method provided by an embodiment of the present application;
FIG. 1f is a matching map of a game video editing method provided by an embodiment of the present application;
FIG. 1g is a processed matching map of a game video editing method provided by an embodiment of the present application;
FIG. 1h is another schematic illustration of a game video editing method provided by an embodiment of the present application;
FIG. 1i is another schematic illustration of a game video editing method provided by an embodiment of the present application;
FIG. 1j is another schematic illustration of a game video editing method provided by an embodiment of the present application;
FIG. 1k is another schematic illustration of a game video editing method provided by an embodiment of the present application;
FIG. 1l is a countermeasure parameter distribution diagram of a game video editing method provided by an embodiment of the present application;
FIG. 2 is another flow chart of a game video editing method provided by an embodiment of the present application;
FIG. 3a is a schematic structural diagram of a game video editing device provided by an embodiment of the present application;
FIG. 3b is another schematic diagram of a game video editing device provided by an embodiment of the present application;
FIG. 3c is another schematic diagram of a game video editing device provided by an embodiment of the present application;
FIG. 3d is another schematic diagram of a game video editing device provided by an embodiment of the present application;
FIG. 4 is a schematic structural diagram of an electronic device provided by an embodiment of the present application.
Detailed Description
The technical solutions in the embodiments of the present application are described below clearly and completely with reference to the accompanying drawings. It is evident that the described embodiments are only some, not all, of the embodiments of the present application. All other embodiments obtained by those skilled in the art based on the embodiments herein without inventive effort fall within the scope of protection of the present application.
The embodiment of the application provides a game video editing method and device. Specifically, the embodiment provides a game video editing device suitable for an electronic device, which may be a terminal, a server, or another device.
It will be appreciated that the game video editing method of the present embodiment may be executed on the terminal, on the server, or jointly by the terminal and the server.
Referring to FIG. 1a, an example is given in which a terminal and a server jointly execute the game video editing method. The game video editing system provided by the embodiment of the application comprises a terminal 10, a server 11, and the like; the terminal 10 and the server 11 are connected via a network, for example a wired or wireless network connection, and the game video editing device may be integrated in the server.
The terminal 10 may record an original game video and send a highlight-clip cutting instruction together with the original game video to the server 11, so that the server 11 extracts multiple frames of reference game pictures from the original game video, calculates a countermeasure parameter for each frame, cuts the highlight clip from the original game video based on those parameters, and returns the highlight clip to the terminal 10. The terminal 10 may be a game console, a mobile phone, a smart TV, a tablet computer, a notebook computer, or a personal computer (PC), among others.
The server 11 may be configured to: acquire the original game video, namely the video from which a highlight game clip is to be cut, and extract multiple frames of reference game pictures from it; identify, for each frame of reference game picture, the position information of the game characters in the game scene and the game faction to which each character belongs; calculate, for each frame, the countermeasure parameter of the picture based on the number of game characters in the scene and the distances between characters of opposing factions in combat, where the countermeasure parameter characterizes the intensity of the confrontation between the factions; cut a target game video clip, i.e., a highlight clip of the original game video, from the original game video based on the countermeasure parameter of each reference frame; and then transmit the highlight clip to the terminal 10. The server 11 may be a single server or a server cluster composed of a plurality of servers.
The above-described process of the server 11 editing the target game video clip may also be performed by the terminal 10.
The game video editing method provided by the embodiment of the application relates to the Computer Vision (CV) technology in the field of Artificial Intelligence (AI). According to the embodiment of the application, the characteristics of game confrontation can be exploited: the intensity of the confrontation between the game factions is calculated based on the distances between game characters of different factions and the number of engaged characters, so that the highlight game clips are determined, greatly improving the accuracy and speed of highlight clipping.
Artificial Intelligence (AI) is the theory, method, technology, and application system that uses a digital computer, or a machine controlled by a digital computer, to simulate, extend, and expand human intelligence, perceive the environment, acquire knowledge, and use knowledge to obtain optimal results. In other words, artificial intelligence is a comprehensive technology of computer science that attempts to understand the essence of intelligence and to produce a new kind of intelligent machine that can react in a way similar to human intelligence. Artificial intelligence studies the design principles and implementation methods of various intelligent machines so that the machines have the functions of perception, reasoning, and decision-making. Artificial intelligence is a comprehensive discipline covering a wide range of fields, including both hardware-level and software-level technologies. Artificial intelligence software technology mainly includes computer vision, speech processing, natural language processing, and machine learning/deep learning, among other directions.
Computer Vision (CV) is a science that studies how to make machines "see"; more specifically, it uses cameras and computers in place of human eyes to identify, track, and measure targets, and further performs graphics processing so that the result becomes an image more suitable for human observation or for transmission to instruments for detection. As a scientific discipline, computer vision studies related theories and technologies in an attempt to build artificial intelligence systems that can acquire information from images or multidimensional data. Computer vision techniques typically include image processing, image recognition, image semantic understanding, image retrieval, OCR, video processing, video semantic understanding, video content/behavior recognition, three-dimensional object reconstruction, 3D techniques, virtual reality, augmented reality, and simultaneous localization and mapping, as well as common biometric recognition techniques such as face recognition and fingerprint recognition.
Detailed descriptions are given below. The order in which the following embodiments are described is not intended as a limitation on a preferred order of the embodiments.
The embodiments of the present application are described from the perspective of a game video editing apparatus, which may be integrated in an electronic device; the electronic device may be a server, a terminal, or the like.
The game video editing method can be applied to various scenarios in which highlight game video clips need to be cut. For example, when a user plays a game and wants to share the game video with others, the relatively long duration of the video makes it unsuitable to share in full. With the game video editing method provided by this embodiment, the game video can be edited to obtain the highlight clips; the method cuts the highlights of a game video more accurately, edits quickly, and can automatically edit a long game video into a clipped video containing only the highlights, reducing a large amount of manual editing work.
As shown in FIG. 1b, the specific flow of the game video editing method is as follows; the method may be executed by a server or a terminal, which is not limited in this embodiment.
101. An original game video is acquired, and a plurality of frames of reference game pictures are extracted from the original game video.
In this embodiment, the original game video is the game video to be cut, that is, the video to be edited; the target game video clip is obtained by cutting the original game video, and may be a highlight clip of the original game video. The game type of the original game video is not limited, and the game scene may be a combat game scene, for example a massively multiplayer online role-playing game (MMORPG) scene, a sports game (SPG) scene, or a multiplayer online battle arena (MOBA) game scene, which is not limited in this embodiment.
There are various ways to obtain the original game video.
For example, the original game video may be acquired by a video recording device on the electronic device, for example, when a game video recording request instruction is received, the video recording device is started to record the game video, and the recorded game video is taken as the original game video.
For example, the original game video may also be obtained from a database local to the electronic device, for example, the original game video is stored in the database local to the electronic device, and when a cut instruction for a highlight game segment of the original game video is received, the original game video may be directly obtained from the database local to the electronic device, where local refers to the electronic device.
For example, the original game video may also be obtained via the internet and provided to the game video editing device, for example, via internet download.
For example, the original game video may also be obtained by other devices and further provided to the game video clipping apparatus, i.e. the game video clipping apparatus may specifically receive the original game video transmitted by other devices such as other terminals.
For video, the content consists of a series of video frames, and the playback rate is usually expressed in frames per second (FPS). Each video frame is a still image; when the frames are played in sequence, a moving image is produced. For example, video content created at 30 FPS means that 30 "still images" are played per second.
In this embodiment, the original game video comprises multiple video frames, which here are game pictures; a portion of the video frames may be extracted from the original game video as reference game pictures at a preset time interval, which may be set according to the actual situation. For example, if the original game video has 35 frames per second, one frame may be extracted every 5 frames, i.e., 7 frames per second, as reference game pictures.
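The interval-based sampling just described can be sketched as follows; the stride value and helper name are illustrative assumptions:

```python
def reference_frame_indices(total_frames: int, stride: int) -> list:
    """Return the indices of frames sampled every `stride` frames,
    which serve as the reference game pictures."""
    return list(range(0, total_frames, stride))

# One second of a 35-fps video sampled every 5 frames yields 7 reference frames.
indices = reference_frame_indices(total_frames=35, stride=5)
print(indices)  # [0, 5, 10, 15, 20, 25, 30]
```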
102. For each frame of reference game picture, the position information of the game characters in the game scene and the game faction to which each character belongs are identified.
The game characters are virtual characters in a game scene, and can be virtual characters, virtual animals or the like. The game opponent refers to a game camp, in general, game characters are divided into different camps, such as a red square and a blue square, and the number of game camps is usually two or more, and each game camp comprises at least one game character. The antagonism between different game camps generally represents a highlight of the game. In the target game scene, a plurality of game characters of the same game camp are in a mutual cooperative relationship, and the game characters of different game camps are in a mutual competing relationship.
A game scene generally contains visual elements designed to attract players, and the positions of game characters can be found through these elements, in particular through the parts of the target to be detected that change the least. For example, a game character typically has a life value indicating template above it, which indicates the combat value or life value of the game character; when this value falls below a certain threshold, the corresponding game character exits the battle. In particular, the life value indicating template may be a blood bar, i.e., a fixed-shape bar above the game character's head. Since the life value indicating template sits above the game character, the position of the game character may be determined from the position of the template.
Optionally, in some embodiments, the step of "identifying, for each frame of the reference game screen, the position information of the game character in the game scene in the reference game screen, and the game opponent information to which the game character belongs" may include:
aiming at each frame of reference game picture, obtaining a preset life value indication template of a game role in a game scene;
Identifying position information of a game character in a game scene in a reference game picture based on a sliding preset life value indication template;
and identifying the game opponent to which the game role belongs in the reference game picture to obtain the game opponent information to which the game role belongs in the reference game picture.
Wherein, in some embodiments, the step of identifying the position information of the game character in the game scene in the reference game screen based on the sliding preset life value indication template may include:
identifying the position of a life value indication template of a game role in a reference game picture based on the sliding of the preset life value indication template on the reference game picture;
and acquiring the position information of the game role in the game scene in the reference game picture based on the position of the life value indication template.
The preset life value indication template may be a blood bar template. A life value indication template, i.e., a blood bar, is provided above the head of each game character; the blood bar represents the life value or combat value of the corresponding game character, and the outline of the blood bar is the same for every game character. The positions of the life value indication templates of all game characters, i.e., the positions of their blood bars, can therefore be found in a game picture by template matching, and the position of each game character in the reference game picture can then be located from the position of its blood bar.
Template matching is a technique for searching one image for the part that best matches (is most similar to) another template image. Applying this technique, the part of the reference game picture that best matches the blood bar template can be found by sliding the blood bar template over the reference game picture; that part is the position of the blood bar of a game character in the reference game picture, and the position just below the blood bar is the position of the corresponding game character in the game scene.
The preset life value indication template, i.e., the blood bar template, may be set according to actual conditions, which is not limited in this embodiment. For example, since contents such as the remaining blood volume, color, and cooldown indicators inside the blood bar change continuously, the part of the blood bar that does not change can be used as the blood bar template for template matching; taking the part shared by the blood bars of all game characters under different conditions as the template allows the blood bars to be recognized in as many cases as possible. As shown in FIG. 1c, because the blood volume changes during the game, only one grid of blood volume and the numeric level box in front of the blood bar are taken as the blood bar template; this ensures that the blood bar shape of a game character can be detected in the reference game picture while accounting for the blood volume differing from moment to moment. In addition, the blood bar template can handle matching in various special scenes: for example, as shown in fig. 1d, when a blood bar is cut off at the edge area of the game scene, its position can still be detected through the blood bar template, because the template only matches a part of the blood bar; other situations, such as the blood bars of multiple game characters overlapping, can likewise be identified and are not further described herein.
Optionally, in some embodiments, the step of identifying the location of the vital value indication template of the game character in the reference game picture based on the sliding of the preset vital value indication template on the reference game picture may specifically include:
calculating gray scale similarity of an overlapping area of the preset life value indication template and the reference game picture after sliding based on sliding of the preset life value indication template on the reference game picture;
based on the gray level similarity, determining a position of a life value indication template of a game character in the reference game screen.
Wherein, optionally, in some embodiments, the step of determining, based on the gray level similarity, a location of a life value indication template of a game character in the reference game screen may include:
generating a matching diagram corresponding to the reference game picture based on the gray level similarity, wherein the matching diagram comprises a plurality of pixel points, and the value of each pixel point represents the gray level similarity of a preset life value indication template in an overlapping area of the pixel point and the reference game picture;
determining pixel points with gray level similarity higher than preset similarity in the matching graph as matching pixel points;
And determining the position of a life value indication template of the game role in the reference game picture based on the position of the matched pixel point in the matched picture corresponding to the reference game picture.
In some embodiments, the blood bars of game characters have different colors; for example, different game opponents adopt different blood bar colors, such as red blood bars and blue blood bars. In that case, before the step of calculating the gray scale similarity of the overlapping area of the preset life value indication template and the reference game picture after sliding, the reference game picture further needs to undergo gray scale processing, converting it from a color image to a gray scale image, which is then used for template matching.
The matching chart is specifically a mapping of gray level similarity of an overlapping area of the preset life value indication template and the reference game picture, and the matching chart and the corresponding reference game picture have the same size. The higher the gray level similarity is, the higher the matching degree of the preset life value template at the pixel point is, namely the higher the probability that the pixel point is the position where the blood stripe is located is.
As shown in fig. 1e and 1f, fig. 1f is the matching diagram generated by template matching on fig. 1e. The positions pointed to by the arrows are the positions of the blood bars corresponding to the game characters. Each pixel value in the matching diagram represents the matching degree between the blood bar template at that point and the reference game picture. The generated matching diagram can be thresholded: pixel points whose gray level similarity is higher than a preset similarity are determined to be matching pixel points, where the preset similarity can be set according to practical conditions, which is not limited in this embodiment. Taking 0.5 as the threshold based on the developer's empirical experience, the thresholded result is shown in fig. 1g. After thresholding, only a few small white spots remain in fig. 1g (in the directions indicated by the arrows); these white spots are the matching pixel points. Since several matching pixel points may remain at the position of each blood bar after thresholding, matching pixel points that are close together also need to be merged, and each region takes one of its matching pixel points as the starting point of the blood bar position.
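The merging of nearby matching pixel points described above can be sketched as a simple greedy clustering (the function name and the 5-pixel merge radius are assumptions for illustration):

```python
def merge_matching_points(points, min_dist=5):
    """Keep one matching pixel point per blood-bar region: a point is
    kept only if it is at least `min_dist` pixels away from every
    already-kept point; each kept point serves as a blood bar's start."""
    anchors = []
    for y, x in points:
        if all((y - ay) ** 2 + (x - ax) ** 2 >= min_dist ** 2
               for ay, ax in anchors):
            anchors.append((y, x))
    return anchors

# two clusters of thresholded white spots collapse to two anchors
spots = [(10, 10), (10, 11), (11, 10), (40, 5), (41, 6)]
print(merge_matching_points(spots))
```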
Specifically, the matching of the blood bar templates can be completed conveniently and quickly using the normalized correlation coefficient matching mode (TM_CCOEFF_NORMED) of the template matching function (matchTemplate) of the open source computer vision library (OpenCV, Open Source Computer Vision Library). OpenCV is a cross-platform computer vision library released under the BSD (Berkeley Software Distribution) open source license, and can run on the Linux, Windows, Android, and Mac OS operating systems. TM_CCOEFF_NORMED is a normalized correlation coefficient matching method: a positive value indicates a good match, a negative value indicates a poor match, and the larger the value, the better the match. The template matching function matchTemplate sequentially calculates the gray scale similarity between the blood bar template and each overlapping area of the reference game picture, and stores the results in a mapping image, i.e., the matching diagram, so that the value of each point in the matching diagram represents one similarity comparison result.
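The sliding-window computation performed by TM_CCOEFF_NORMED can be illustrated with a small pure-NumPy re-implementation (a didactic sketch, not OpenCV's optimized code; in practice `cv2.matchTemplate` would be used):

```python
import numpy as np

def match_template_normed(image, template):
    """Slide `template` over the grayscale `image` and return a matching
    diagram of normalized correlation coefficients in [-1, 1], analogous
    to OpenCV's TM_CCOEFF_NORMED mode."""
    ih, iw = image.shape
    th, tw = template.shape
    t = template - template.mean()
    t_norm = np.sqrt((t * t).sum())
    out = np.zeros((ih - th + 1, iw - tw + 1))
    for y in range(out.shape[0]):
        for x in range(out.shape[1]):
            patch = image[y:y + th, x:x + tw]
            p = patch - patch.mean()
            denom = t_norm * np.sqrt((p * p).sum())
            out[y, x] = (p * t).sum() / denom if denom > 0 else 0.0
    return out

# plant a distinctive "blood bar" pattern at row 5, column 7 of a noisy frame
rng = np.random.default_rng(0)
frame = rng.random((20, 30))
bar_template = np.array([[1., 0., 1., 0.],
                         [0., 1., 0., 1.],
                         [1., 1., 0., 0.]])
frame[5:8, 7:11] = bar_template
match_map = match_template_normed(frame, bar_template)
best = np.unravel_index(np.argmax(match_map), match_map.shape)
print(best)
```

An exact occurrence of the template yields a coefficient of 1.0, the maximum possible value, so the argmax of the matching diagram recovers the planted position.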
Optionally, in some embodiments, the step of identifying a game opponent to which the game character belongs in the reference game picture, and obtaining the game opponent information to which the game character belongs in the reference game picture may specifically include:
Acquiring color information of a life value indication template of a game character in the reference game picture;
and determining game opponent information to which the game character belongs in the reference game picture based on the color information of the life value indication template.
Wherein the life value indicating template may be a blood bar. In a game scene, different game opponents are usually distinguished by blood bars of different colors. The image of the blood bar template area corresponding to a game character (i.e., the position of its blood bar) can therefore be extracted, a color average computed over that image, and the game opponent determined by checking which color range the average falls into. For example, suppose there are two game opponents, one with red blood bars and the other with blue blood bars; if the color average of the extracted blood bar template area of a certain game character is closer to blue, the game character belongs to the blue opponent.
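A minimal sketch of this color-average decision (the function name, the RGB pixel format, and the two-camp red/blue rule are assumptions for illustration):

```python
def classify_game_opponent(bar_pixels):
    """Average the (R, G, B) pixels of a blood-bar region and assign the
    camp whose color channel dominates: red opponent vs. blue opponent."""
    n = len(bar_pixels)
    mean_r = sum(p[0] for p in bar_pixels) / n
    mean_b = sum(p[2] for p in bar_pixels) / n
    return "red" if mean_r > mean_b else "blue"

# mostly-red blood bar pixels -> red camp
print(classify_game_opponent([(200, 30, 40), (180, 20, 50)]))
```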
103. And calculating the countermeasure parameters of the reference game picture according to the number of game characters in the game scene and the distance between the game characters when the game opponents fight against each other for each frame of reference game picture to obtain the countermeasure parameters of each frame of reference game picture, wherein the countermeasure parameters represent the countermeasure intensity of each game opponent in each frame of reference game picture.
Wherein the countermeasure parameter, i.e., the combat index, characterizes the intensity of the confrontation between the game opponents in the reference game picture. In a game scene, the magnitude of the countermeasure parameter is related to the number of game characters involved in the confrontation and the distances between game characters of different game opponents. Referring to figs. 1h, 1i, 1j and 1k, the larger the number of game characters and the closer the game characters of different game opponents are to each other, the larger the countermeasure parameter. Figs. 1h, 1i, 1j and 1k show the game scene of a reference game picture, where the square represents the frame in which a game character is located, a circle represents a game character, white circles and black circles represent different game opponents, the bar above a circle represents the blood bar of the game character, and the number in front of the blood bar represents the level of the game character. Alternatively, in some embodiments, the level of the game character may also be considered in calculating the countermeasure parameter.
Optionally, in some embodiments, the step of "referring to a game screen for each frame, calculating a countermeasure parameter of the reference game screen based on the number of game characters in the game scene and a distance between game characters when a game countermeasure is performed" may include:
For each frame of reference game picture, obtaining the distance between game characters when different game opponents fight;
calculating local countermeasure parameters of game characters in each game countermeasure party based on the distance;
and fusing the local countermeasure parameters of each game role in each game countermeasure party to obtain the countermeasure parameters of the reference game picture.
Wherein the distance between the game characters can be replaced by the distance between the blood bars corresponding to the game characters. Specifically, the distance between the game characters when different game opponents fight may be the distance of the blood bar starting point corresponding to the game characters.
The method of fusing the local countermeasure parameters may be to add all the local countermeasure parameters in the reference game frame to obtain the countermeasure parameters of the reference game frame. Wherein the local countermeasure parameter of the game character of the game countermeasure side and the game characters of the other game countermeasure side can be calculated based on the distance between the game character of the game countermeasure side and each of the game characters of the other game countermeasure side at the time of countermeasure.
Optionally, in some embodiments, the step of calculating the local countermeasure parameter of the game character in each game countermeasure party based on the distance may specifically include:
for each game character in each game opponent, weighting the distances from that game character to each game character in the other game opponents, so as to obtain the local countermeasure parameter of that game character, thereby obtaining the local countermeasure parameters of the game characters in each game opponent.
Specifically, in the game scene of the reference game picture, there are two game opponents, a red side and a blue side, and the countermeasure parameter of the reference game picture is calculated as shown in the following formulas (1) and (2):

$$\mathrm{score}=\sum_{i=1}^{L_{red}}\sum_{j=1}^{L_{blue}} s\left(d_{ij}\right),\qquad s(d)=\begin{cases}1-\dfrac{d}{T}, & d<T\\0, & d\geq T\end{cases}\tag{1}$$

$$d_{ij}=\sqrt{\left(\frac{x_i^{red}-x_j^{blue}}{w}\right)^2+\left(\frac{y_i^{red}-y_j^{blue}}{h}\right)^2}\tag{2}$$

where score represents the countermeasure parameter of the reference game picture, $d_{ij}$ represents the distance between game characters of different game opponents, specifically the Euclidean distance between the two blood bar starting points, $L_{red}$ represents the number of game characters on the red side, $L_{blue}$ represents the number of game characters on the blue side, $x_i^{red}$ and $y_i^{red}$ represent the abscissa and ordinate of the blood bar position of the i-th game character on the red side, $x_j^{blue}$ and $y_j^{blue}$ represent the abscissa and ordinate of the blood bar position of the j-th game character on the blue side, and w and h represent the width and height of the video frame, i.e., the reference game picture, respectively.
Formula (1) indicates that when the blood bar positions of different camps are less than a certain distance T apart, the local countermeasure parameter is calculated from that distance, and when the distance between the blood bar positions of different camps is not less than T, the local countermeasure parameter is 0, where the distance T can be set according to the actual situation.
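The per-frame score described above can be sketched as follows (the linear falloff 1 − d/T for the local countermeasure term, the value T = 0.3, and the function name are illustrative assumptions consistent with the description that the local term is computed from the distance below T and is 0 otherwise):

```python
import math

def frame_countermeasure_parameter(red_bars, blue_bars, w, h, T=0.3):
    """Sum the local countermeasure parameters over every pair of
    red/blue blood-bar starting points: distances are normalized by the
    frame width/height, and pairs farther apart than T contribute 0."""
    score = 0.0
    for xr, yr in red_bars:
        for xb, yb in blue_bars:
            d = math.hypot((xr - xb) / w, (yr - yb) / h)
            if d < T:
                score += 1.0 - d / T
    return score

# two red characters near one blue character, plus one distant blue character
red = [(100, 100), (120, 110)]
blue = [(130, 105), (900, 500)]
print(frame_countermeasure_parameter(red, blue, w=1280, h=720))
```

Coincident blood bars contribute the maximum local value of 1.0, while the distant pair contributes nothing, so only the close skirmish drives the score.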
104. A target game video clip is cut from the original game video based on the countermeasure parameters of each frame reference game picture.
In one embodiment, the countermeasure parameters of the original game video are calculated at 10-frame intervals, i.e., the reference video frames are extracted every 10 frames, resulting in the countermeasure parameter distribution map shown in fig. 1l. The abscissa is the video frame number, and the ordinate is the countermeasure parameter, i.e., the combat index. It can be seen that a fierce battle between the different game camps occurred between frames 1100 and 2100; the isolated points in between where the index drops to 0 (e.g., around frame 1250) correspond to moments when both sides pull back and kite during the fight, but those moments still belong to the battle segment.
Optionally, in some embodiments, the step of "clipping out the target game video clip from the original game video based on the countermeasure parameter of each frame of the reference game picture" may include:
dividing the original game video into a plurality of video clips according to a preset time interval;
Fusing the countermeasure parameters of the reference game pictures in each video segment to obtain fused countermeasure parameters of the video segments;
and determining a target game video clip from a plurality of video clips based on the post-fusion countermeasure parameters.
The preset time interval may be set according to the actual situation, which is not limited in this embodiment; for example, the fused countermeasure parameters may be counted in units of seconds. The fused countermeasure parameter of each second is obtained by fusing the countermeasure parameters of the reference game pictures within that second; taking 35 frames per second with extraction every 5 frames as an example, 7 frames of reference game pictures are fused per second. Calculating the fused countermeasure parameters over such intervals prevents the isolated zero-valued points in fig. 1l, caused by both sides pulling back during a fight, from determining by themselves whether the segment in the current second qualifies as a target game video segment; fusing the countermeasure parameters thus reduces their influence.
The method for fusing the countermeasure parameters of the reference game picture in the video segment is various, for example, an average value of the countermeasure parameters of the reference game picture in the video segment can be calculated, and the average value is used as the fused countermeasure parameters of the video segment.
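The averaging option just mentioned can be sketched as follows (the grouping of 7 reference frames per second follows the 35 fps / 5-frame-interval example; the function name is illustrative):

```python
def fuse_countermeasure_parameters(frame_scores, per_second=7):
    """Fuse the per-reference-frame countermeasure parameters into one
    value per second by averaging each consecutive group of frames."""
    fused = []
    for i in range(0, len(frame_scores), per_second):
        group = frame_scores[i:i + per_second]
        fused.append(sum(group) / len(group))
    return fused

# two seconds of scores: a quiet second, then a fierce one
print(fuse_countermeasure_parameters([0, 3, 0, 4, 0, 0, 0, 5, 6, 7, 4, 3, 2, 1]))
```

Note how the isolated zeros in the second group no longer zero out the whole second once the values are averaged.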
Optionally, the step of determining the target game video clip from the plurality of video clips based on the post-fusion countermeasure parameter may include:
and determining the video segments with the fused countermeasure parameters larger than the preset value as target game video segments.
The preset value may be set according to the actual situation, which is not limited in this embodiment. The video segments whose fused countermeasure parameters are larger than the preset value are taken as target game video segments, i.e., highlight segments of the original game video, and all highlight segments are spliced together to obtain the final game highlight compilation video.
Alternatively, in some embodiments, a little time may be reserved before and after the highlight in editing, so that the video may look smoother and more natural.
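Selecting the target segments and reserving a little time around them can be sketched as follows (the threshold, the one-second padding, and the merging of overlapping segments are illustrative assumptions):

```python
def select_highlight_segments(fused_scores, threshold=2.0, pad=1):
    """Return (start_second, end_second) spans whose fused countermeasure
    parameter exceeds `threshold`, padded by `pad` seconds on each side;
    overlapping or touching spans are merged into one highlight."""
    segments = []
    for sec, score in enumerate(fused_scores):
        if score > threshold:
            start, end = max(0, sec - pad), sec + 1 + pad
            if segments and start <= segments[-1][1]:
                segments[-1] = (segments[-1][0], end)  # merge with previous
            else:
                segments.append((start, end))
    return segments

print(select_highlight_segments([0.5, 3.0, 0.2, 0.1, 0.1, 5.0]))
```

The padding is what makes the final clipped video look smoother and more natural, as described above.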
This embodiment can extract the highlights of a game video and automatically clip them to generate a game highlight video. Specifically, the positions of the blood bars of all game characters in the game picture are detected through image template matching, thereby determining the positions of the game characters in the game picture. The camp of each game character is distinguished by the color of its blood bar. For every n frames of the game video, the distances between the blood bar starting positions of game characters of different camps and the number of game characters in the picture are calculated, and a combat index (countermeasure parameter) is computed, exploiting the characteristic that when two sides fight, their positions are close and many game characters are present. Highlight moments are extracted from the game video wherever the combat index exceeds a certain threshold, and finally all highlight segments are clipped together to generate the final game video. Because it is based on the characteristics of game combat events and on image matching of the game characters' blood bars, this embodiment extracts video highlights with high accuracy and speed, can capture highlight shots such as fights that do not produce kills, and can automatically clip a long game video into a clip video containing only the highlights, eliminating a large amount of manual editing work.
As can be seen from the above, the present embodiment can obtain an original game video, and extract a plurality of frames of reference game frames from the original game video; identifying, for each frame of the reference game picture, position information of a game character in a game scene in the reference game picture, and game opponent information to which the game character belongs; calculating the countermeasure parameters of the reference game picture according to the number of game characters in the game scene and the distance between the game characters when the game countermeasure party is countermeasure, and obtaining the countermeasure parameters of each frame of reference game picture, wherein the countermeasure parameters represent the countermeasure intensity of each game countermeasure party in each frame of reference game picture; a target game video clip is cut from the original game video based on the countermeasure parameters of each frame reference game picture. According to the method and the device, the characteristics of game countermeasure can be utilized, the countermeasure intensity degree of each game countermeasure party is calculated based on the distances among game roles in different game countermeasure parties and the number of the game roles in the countermeasure, so that the highlight game fragments are determined, and the accuracy and the speed of the highlight game fragment clipping are greatly improved.
The method according to the previous embodiment will be described in further detail below with the game video clip device being integrated in the server.
An embodiment of the present application provides a game video editing method, as shown in fig. 2, a specific flow of the game video editing method may be as follows:
201. the server receives the original game video sent by the terminal.
In this embodiment, the original game video is the game video to be clipped, i.e., the video to be cut, and the target game video clip obtained by clipping the original game video may be a highlight segment of the original game video. The game type of the original game video is not limited; the game scene may be a combat game scene, for example a massively multiplayer online role-playing game (MMORPG, Massively Multiplayer Online Role Playing Game) scene, a sports game (SPG, Sports Game) scene, a multiplayer online battle arena (MOBA, Multiplayer Online Battle Arena) game scene, and so on, which is not limited in this embodiment.
202. The server extracts multi-frame reference game pictures from the original game video.
In this embodiment, the original game video includes a plurality of frames of video frames, where the video frames are specifically game frames, and a portion of the video frames may be extracted from the original game video as reference game frames according to a preset time interval, where the preset time interval may be specifically set according to an actual situation. For example, the original game video has 35 frames per second, and the extraction can be performed at intervals of 5 frames, i.e., 7 frames per second, as the reference game picture.
203. The server identifies, for each frame of the reference game screen, position information of a game character in the game scene in the reference game screen, and game opponent information to which the game character belongs.
The game opponents refer to game camps: in a typical game, game characters are divided into different camps, such as a red side and a blue side, and there are usually two or more game camps. The confrontation between different game camps generally constitutes a highlight of the game.
Optionally, in some embodiments, the step of "the server identifying, for each frame of the reference game screen, location information of the game character in the game scene in the reference game screen, and game opponent information to which the game character belongs" may include:
Aiming at each frame of reference game picture, obtaining a preset life value indication template of a game role in a game scene;
identifying position information of a game character in a game scene in a reference game picture based on a sliding preset life value indication template;
and identifying the game opponent to which the game role belongs in the reference game picture to obtain the game opponent information to which the game role belongs in the reference game picture.
Wherein, in some embodiments, the step of identifying the position information of the game character in the game scene in the reference game screen based on the sliding preset life value indication template may include:
identifying the position of a life value indication template of a game role in a reference game picture based on the sliding of the preset life value indication template on the reference game picture;
and acquiring the position information of the game role in the game scene in the reference game picture based on the position of the life value indication template.
The preset life value indication template may be a blood bar template. A life value indication template, i.e., a blood bar, is provided above the head of each game character; the blood bar represents the life value or combat value of the corresponding game character, and the outline of the blood bar is the same for every game character. The positions of the life value indication templates of all game characters, i.e., the positions of their blood bars, can therefore be found in a game picture by template matching, and the position of each game character in the reference game picture can then be located from the position of its blood bar.
Template matching is a technique for searching one image for the part that best matches (is most similar to) another template image. Applying this technique, the part of the reference game picture that best matches the blood bar template can be found by sliding the blood bar template over the reference game picture; that part is the position of the blood bar of a game character in the reference game picture, and the position just below the blood bar is the position of the corresponding game character in the game scene.
The preset life value indication template, i.e., the blood bar template, can be set according to actual conditions, which is not limited in this embodiment. For example, since contents such as the remaining blood volume, color, and cooldown indicators inside the blood bar change continuously, the unchanged part of the blood bar can be used as the blood bar template for template matching; taking the part shared by the blood bars of all game characters under different conditions as the template allows the blood bars of each game character to be identified in as many cases as possible. As shown in FIG. 1c, only one grid of blood volume and the numeric level box in front of the blood bar are taken as the blood bar template, which ensures that the blood bar shape of a game character can be detected in the reference game picture while accounting for the blood volume differing from moment to moment. In addition, the blood bar template can handle matching in various special scenes: for example, as shown in fig. 1d, when a blood bar is cut off at the edge area of the game scene, its position can still be detected through the blood bar template, because the template only matches a part of the blood bar; other situations, such as the blood bars of multiple game characters overlapping, can likewise be identified and are not further described herein.
Optionally, in some embodiments, the step of identifying the location of the vital value indication template of the game character in the reference game picture based on the sliding of the preset vital value indication template on the reference game picture may specifically include:
calculating gray scale similarity of an overlapping area of the preset life value indication template and the reference game picture after sliding based on sliding of the preset life value indication template on the reference game picture;
based on the gray level similarity, determining a position of a life value indication template of a game character in the reference game screen.
Wherein, optionally, in some embodiments, the step of determining, based on the gray level similarity, a location of a life value indication template of a game character in the reference game screen may include:
generating a matching diagram corresponding to the reference game picture based on the gray level similarity, wherein the matching diagram comprises a plurality of pixel points, and the value of each pixel point represents the gray level similarity of a preset life value indication template in an overlapping area of the pixel point and the reference game picture;
determining pixel points with gray level similarity higher than preset similarity in the matching graph as matching pixel points;
And determining the position of a life value indication template of the game role in the reference game picture based on the position of the matched pixel point in the matched picture corresponding to the reference game picture.
The matching map is specifically a mapping of the gray level similarity of the overlapping area between the preset life value indication template and the reference game picture. The higher the gray level similarity, the higher the matching degree of the preset life value indication template at that pixel point, i.e., the higher the probability that the pixel point is the position of a blood bar.
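As an illustrative sketch of the sliding-template matching and thresholding described above (not the patent's implementation: the similarity metric, 1 minus the normalized mean absolute difference, and all names are assumptions):

```python
# Slide a small grayscale template over a reference frame, score each offset,
# and keep offsets whose similarity exceeds a preset threshold.

def match_template(frame, template, threshold=0.9):
    fh, fw = len(frame), len(frame[0])
    th, tw = len(template), len(template[0])
    match_map = [[0.0] * (fw - tw + 1) for _ in range(fh - th + 1)]
    hits = []
    for y in range(fh - th + 1):
        for x in range(fw - tw + 1):
            # Mean absolute difference over the overlap region (pixels in 0..255).
            diff = sum(
                abs(frame[y + i][x + j] - template[i][j])
                for i in range(th) for j in range(tw)
            ) / (th * tw)
            score = 1.0 - diff / 255.0  # 1.0 means an identical overlap
            match_map[y][x] = score
            if score > threshold:
                hits.append((x, y, score))  # top-left corner of a blood bar
    return match_map, hits

# Toy 4x6 frame containing the 2x3 template at offset (1, 1).
template = [[10, 200, 10],
            [10, 200, 10]]
frame = [[0,  0,   0,  0, 0, 0],
         [0, 10, 200, 10, 0, 0],
         [0, 10, 200, 10, 0, 0],
         [0,  0,   0,  0, 0, 0]]
match_map, hits = match_template(frame, template, threshold=0.95)
print(hits[0][:2])  # → (1, 1)
```

In a production setting the same step is usually done with a library routine such as OpenCV's `matchTemplate`, which likewise produces a per-offset similarity map that is then thresholded.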
Optionally, in some embodiments, the step of identifying a game opponent to which the game character belongs in the reference game picture, and obtaining the game opponent information to which the game character belongs in the reference game picture may specifically include:
acquiring color information of a life value indication template of a game character in the reference game picture;
and determining game opponent information to which the game character belongs in the reference game picture based on the color information of the life value indication template.
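A minimal sketch of this color-based party identification, under the assumed convention that one party's blood bars are green-dominant and the other's red-dominant (the color scheme and all names are illustrative, not taken from the patent):

```python
# Infer the game countermeasure party to which a character belongs from the
# dominant color channel of its blood bar region.

def party_from_bar_color(bar_pixels):
    """bar_pixels: list of (r, g, b) tuples sampled from the blood bar region."""
    n = len(bar_pixels)
    mean_r = sum(p[0] for p in bar_pixels) / n
    mean_g = sum(p[1] for p in bar_pixels) / n
    mean_b = sum(p[2] for p in bar_pixels) / n
    if mean_g > mean_r and mean_g > mean_b:
        return "ally"   # green-dominant bar (assumed convention)
    if mean_r > mean_g and mean_r > mean_b:
        return "enemy"  # red-dominant bar (assumed convention)
    return "unknown"

print(party_from_bar_color([(20, 220, 30), (25, 210, 35)]))  # → ally
print(party_from_bar_color([(230, 40, 30), (210, 50, 35)]))  # → enemy
```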
204. For each frame of reference game picture, the server calculates the countermeasure parameter of the reference game picture based on the number of game characters in the game scene and the distance between game characters when the game countermeasure parties fight, so as to obtain the countermeasure parameter of each frame of reference game picture, wherein the countermeasure parameter characterizes the countermeasure intensity of each game countermeasure party in each frame of reference game picture.
Optionally, in some embodiments, the step of "calculating, for each frame of reference game picture, the countermeasure parameter of the reference game picture based on the number of game characters in the game scene and the distance between game characters when the game countermeasure parties fight" may specifically include:
for each frame of reference game picture, obtaining the distance between game characters when different game opponents fight;
calculating local countermeasure parameters of game characters in each game countermeasure party based on the distance;
and fusing the local countermeasure parameters of each game role in each game countermeasure party to obtain the countermeasure parameters of the reference game picture.
Wherein, the distance between game characters can be replaced by the distance between the blood bars corresponding to the game characters. Specifically, the distance between game characters when different game countermeasure parties fight may be the distance between the starting points of the blood bars corresponding to the game characters.
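The distance-based scoring above can be sketched as follows. The inverse-distance weighting, the fusion by summation, and all names are illustrative assumptions; the essential property is only that closer opposing characters, and more of them, yield a higher countermeasure parameter:

```python
import math

def local_param(char_pos, enemy_positions):
    # Weight each opposing character by the inverse of its distance, so that
    # close-quarters fights score higher than distant skirmishes.
    total = 0.0
    for ex, ey in enemy_positions:
        d = math.hypot(char_pos[0] - ex, char_pos[1] - ey)
        total += 1.0 / (1.0 + d)
    return total

def frame_param(team_a, team_b):
    # Fuse the per-character local parameters of both parties by summation.
    params = [local_param(p, team_b) for p in team_a]
    params += [local_param(p, team_a) for p in team_b]
    return sum(params)

close_fight = frame_param([(0, 0), (1, 0)], [(2, 0), (3, 0)])
far_apart   = frame_param([(0, 0), (1, 0)], [(50, 0), (60, 0)])
print(close_fight > far_apart)  # → True
```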
205. The server cuts out a target game video clip from the original game video based on the challenge parameters of each frame of the reference game picture.
Optionally, in some embodiments, the step of "clipping out the target game video clip from the original game video based on the countermeasure parameter of each frame of the reference game picture" may include:
Dividing the original game video into a plurality of video clips according to a preset time interval;
fusing the countermeasure parameters of the reference game pictures in each video segment to obtain fused countermeasure parameters of the video segments;
and determining a target game video clip from a plurality of video clips based on the post-fusion countermeasure parameters.
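A minimal sketch of this clipping step (the mean fusion, the threshold, and all names are illustrative assumptions):

```python
# Split per-frame countermeasure parameters into fixed-length segments, fuse
# each segment's parameters, and keep segments whose fused value exceeds a
# preset threshold.

def select_highlights(frame_params, frames_per_segment, threshold):
    highlights = []
    for start in range(0, len(frame_params), frames_per_segment):
        segment = frame_params[start:start + frames_per_segment]
        fused = sum(segment) / len(segment)  # fusion by averaging
        if fused > threshold:
            highlights.append((start, start + len(segment), fused))
    return highlights

# Ten reference frames; frames 4-7 carry an intense team fight.
params = [0.1, 0.2, 0.1, 0.3, 2.5, 3.1, 2.8, 2.0, 0.2, 0.1]
hits = select_highlights(params, frames_per_segment=4, threshold=1.0)
print([h[:2] for h in hits])  # → [(4, 8)]
```

Mapping segment indices back to timestamps (frame index times the sampling interval) then yields the cut points of the target game video clip.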
206. And the server sends the target game video clip to the terminal.
As can be seen from the above, in this embodiment the server can receive the original game video sent by the terminal; extract multi-frame reference game pictures from the original game video; identify, for each frame of reference game picture, the position information of the game characters in the game scene in the reference game picture and the game countermeasure party information to which each game character belongs; calculate, for each frame of reference game picture, the countermeasure parameter of the reference game picture based on the number of game characters in the game scene and the distance between game characters when the game countermeasure parties fight, obtaining the countermeasure parameter of each frame of reference game picture, wherein the countermeasure parameter characterizes the countermeasure intensity of each game countermeasure party in each frame of reference game picture; clip a target game video clip out of the original game video based on the countermeasure parameter of each frame of reference game picture; and send the target game video clip to the terminal. In this way, the characteristics of game confrontation can be utilized: the countermeasure intensity of each game countermeasure party is calculated based on the distances between game characters in different game countermeasure parties and the number of game characters in the confrontation, so as to determine the highlight game clips, which greatly improves the accuracy and speed of highlight clip editing.
In order to better implement the above method, the embodiment of the present application further provides a game video clipping device, as shown in fig. 3a, which may include an acquisition unit 301, an identification unit 302, a calculation unit 303, and a clipping unit 304, as follows:
(1) An acquisition unit 301;
an acquisition unit 301 is configured to acquire an original game video, and extract a multi-frame reference game picture from the original game video.
(2) An identification unit 302;
an identification unit 302 for identifying, for each frame of the reference game screen, position information of the game character in the game scene in the reference game screen, and game opponent information to which the game character belongs.
Optionally, in some embodiments of the present application, the identifying unit 302 may include a first acquiring subunit 3021, a first identifying subunit 3022, and a second identifying subunit 3023, see fig. 3b, as follows:
the first obtaining subunit 3021 is configured to, for each frame of reference game picture, obtain a preset life value indication template of a game character in the game scene;
a first recognition subunit 3022 for recognizing positional information of the game character in the game scene in the reference game screen based on the sliding preset life value indication template;
And a second identifying subunit 3023, configured to identify a game opponent to which the game character belongs in the reference game screen, so as to obtain game opponent information to which the game character belongs in the reference game screen.
Optionally, in some embodiments of the present application, the second identifying subunit 3023 may be specifically configured to obtain color information of a vital value indication template of the game character in the reference game screen; and determining game opponent information to which the game character belongs in the reference game picture based on the color information of the life value indication template.
Optionally, in some embodiments of the present application, the first identifying subunit 3022 may specifically be configured to identify, based on the sliding of the preset vital value indication template on the reference game screen, a location of the vital value indication template of the game character in the reference game screen; and acquiring the position information of the game role in the game scene in the reference game picture based on the position of the life value indication template.
Optionally, in some embodiments, the step of identifying the location of the vital value indication template of the game character in the reference game picture based on the sliding of the preset vital value indication template on the reference game picture may specifically include:
calculating, based on the sliding of the preset life value indication template on the reference game picture, the gray level similarity of the overlapping area between the slid template and the reference game picture;
based on the gray level similarity, determining a position of a life value indication template of a game character in the reference game screen.
Wherein, optionally, in some embodiments, the step of determining, based on the gray level similarity, a location of a life value indication template of a game character in the reference game screen may include:
generating a matching diagram corresponding to the reference game picture based on the gray level similarity, wherein the matching diagram comprises a plurality of pixel points, and the value of each pixel point represents the gray level similarity of a preset life value indication template in an overlapping area of the pixel point and the reference game picture;
determining pixel points with gray level similarity higher than preset similarity in the matching graph as matching pixel points;
and determining the position of a life value indication template of the game character in the reference game picture based on the position of the matching pixel point in the matching map corresponding to the reference game picture.
(3) A calculation unit 303;
a calculating unit 303, configured to calculate, for each frame of reference game picture, the countermeasure parameter of the reference game picture based on the number of game characters in the game scene and the distance between game characters when the game countermeasure parties fight, so as to obtain the countermeasure parameter of each frame of reference game picture, where the countermeasure parameter characterizes the countermeasure intensity of each game countermeasure party in each frame of reference game picture.
Optionally, in some embodiments of the present application, the computing unit 303 may include a second acquiring subunit 3031, a computing subunit 3032, and a first fusion subunit 3033, see fig. 3c, as follows:
the second obtaining subunit 3031 is configured to obtain, for each frame of reference game picture, the distance between game characters when different game countermeasure parties fight;
a calculating subunit 3032, configured to calculate, based on the distance, a local countermeasure parameter of the game character in each game countermeasure party;
a first merging subunit 3033, configured to obtain the countermeasure parameters of the reference game frame by merging the local countermeasure parameters of each game character in each game countermeasure party.
Optionally, in some embodiments, the calculating subunit 3032 may be specifically configured to, for each game character in each game countermeasure party, weight the distances between that game character and each game character in the other game countermeasure parties to obtain the local countermeasure parameter of that game character, thereby obtaining the local countermeasure parameters of the game characters in each game countermeasure party.
(4) A clipping unit 304;
and a clipping unit 304, configured to clip a target game video clip out of the original game video based on the countermeasure parameter of each frame of reference game picture.
Alternatively, in some embodiments of the present application, the clipping unit 304 may include a dividing subunit 3041, a second fusing subunit 3042, and a determining subunit 3043, see fig. 3d, as follows:
the dividing subunit 3041 is configured to divide the original game video into a plurality of video segments according to a preset time interval;
a second fusion subunit 3042, configured to fuse the countermeasure parameters of the reference game frames in each video segment to obtain fused countermeasure parameters of the video segment;
a determining subunit 3043, configured to determine a target game video segment from the plurality of video segments based on the post-fusion countermeasure parameter.
As can be seen from the above, in this embodiment, the acquiring unit 301 may acquire an original game video and extract multi-frame reference game pictures from the original game video; the identifying unit 302 identifies, for each frame of reference game picture, the position information of the game characters in the game scene in the reference game picture and the game countermeasure party information to which each game character belongs; the calculating unit 303 calculates, for each frame of reference game picture, the countermeasure parameter of the reference game picture based on the number of game characters in the game scene and the distance between game characters when the game countermeasure parties fight, obtaining the countermeasure parameter of each frame of reference game picture, which characterizes the countermeasure intensity of each game countermeasure party in each frame of reference game picture; and the clipping unit 304 clips a target game video clip out of the original game video based on the countermeasure parameter of each frame of reference game picture. In this way, the characteristics of game confrontation can be utilized: the countermeasure intensity of each game countermeasure party is calculated based on the distances between game characters in different game countermeasure parties and the number of game characters in the confrontation, so as to determine the highlight game clips, which greatly improves the accuracy and speed of highlight clip editing.
The embodiment of the application also provides an electronic device, as shown in fig. 4, which shows a schematic structural diagram of the electronic device according to the embodiment of the application, specifically:
the electronic device may include a processor 401 having one or more processing cores, a memory 402 comprising one or more computer-readable storage media, a power supply 403, and an input unit 404, among other components. Those skilled in the art will appreciate that the electronic device structure shown in FIG. 4 does not limit the electronic device, which may include more or fewer components than shown, combine certain components, or use a different arrangement of components. Wherein:
the processor 401 is a control center of the electronic device, connects various parts of the entire electronic device using various interfaces and lines, and performs various functions of the electronic device and processes data by running or executing software programs and/or modules stored in the memory 402, and calling data stored in the memory 402. Optionally, processor 401 may include one or more processing cores; preferably, the processor 401 may integrate an application processor and a modem processor, wherein the application processor mainly processes an operating system, a user interface, an application program, etc., and the modem processor mainly processes wireless communication. It will be appreciated that the modem processor described above may not be integrated into the processor 401.
The memory 402 may be used to store software programs and modules, and the processor 401 executes various functional applications and performs data processing by running the software programs and modules stored in the memory 402. The memory 402 may mainly include a program storage area and a data storage area, wherein the program storage area may store an operating system, an application program required for at least one function (such as a sound playing function or an image playing function), and the like; the data storage area may store data created according to the use of the electronic device, and the like. In addition, the memory 402 may include high-speed random access memory and may also include non-volatile memory, such as at least one magnetic disk storage device, a flash memory device, or another non-volatile solid-state storage device. Accordingly, the memory 402 may also include a memory controller to provide the processor 401 with access to the memory 402.
The electronic device further comprises a power supply 403 for supplying power to the various components, preferably the power supply 403 may be logically connected to the processor 401 by a power management system, so that functions of managing charging, discharging, and power consumption are performed by the power management system. The power supply 403 may also include one or more of any of a direct current or alternating current power supply, a recharging system, a power failure detection circuit, a power converter or inverter, a power status indicator, and the like.
The electronic device may further comprise an input unit 404, which input unit 404 may be used for receiving input digital or character information and generating keyboard, mouse, joystick, optical or trackball signal inputs in connection with user settings and function control.
Although not shown, the electronic device may further include a display unit or the like, which is not described herein. In particular, in this embodiment, the processor 401 in the electronic device loads executable files corresponding to the processes of one or more application programs into the memory 402 according to the following instructions, and the processor 401 executes the application programs stored in the memory 402, so as to implement various functions as follows:
acquiring an original game video, and extracting multi-frame reference game pictures from the original game video; identifying, for each frame of reference game picture, the position information of the game characters in the game scene in the reference game picture and the game countermeasure party information to which each game character belongs; calculating, for each frame of reference game picture, the countermeasure parameter of the reference game picture based on the number of game characters in the game scene and the distance between game characters when the game countermeasure parties fight, to obtain the countermeasure parameter of each frame of reference game picture, wherein the countermeasure parameter characterizes the countermeasure intensity of each game countermeasure party in each frame of reference game picture; and clipping a target game video clip out of the original game video based on the countermeasure parameter of each frame of reference game picture.
The specific implementation of each operation above may be referred to the previous embodiments, and will not be described herein.
As can be seen from the above, this embodiment can acquire an original game video and extract multi-frame reference game pictures from the original game video; identify, for each frame of reference game picture, the position information of the game characters in the game scene in the reference game picture and the game countermeasure party information to which each game character belongs; calculate, for each frame of reference game picture, the countermeasure parameter of the reference game picture based on the number of game characters in the game scene and the distance between game characters when the game countermeasure parties fight, obtaining the countermeasure parameter of each frame of reference game picture, wherein the countermeasure parameter characterizes the countermeasure intensity of each game countermeasure party in each frame of reference game picture; and clip a target game video clip out of the original game video based on the countermeasure parameter of each frame of reference game picture. In this way, the characteristics of game confrontation can be utilized: the countermeasure intensity of each game countermeasure party is calculated based on the distances between game characters in different game countermeasure parties and the number of game characters in the confrontation, so as to determine the highlight game clips, which greatly improves the accuracy and speed of highlight clip editing.
Those of ordinary skill in the art will appreciate that all or a portion of the steps of the various methods of the above embodiments may be performed by instructions, or by instructions controlling associated hardware, which may be stored in a computer-readable storage medium and loaded and executed by a processor.
To this end, embodiments of the present application provide a storage medium having stored therein a plurality of instructions capable of being loaded by a processor to perform steps in any of the game video clip methods provided by embodiments of the present application. For example, the instructions may perform the steps of:
acquiring an original game video, and extracting multi-frame reference game pictures from the original game video; identifying, for each frame of reference game picture, the position information of the game characters in the game scene in the reference game picture and the game countermeasure party information to which each game character belongs; calculating, for each frame of reference game picture, the countermeasure parameter of the reference game picture based on the number of game characters in the game scene and the distance between game characters when the game countermeasure parties fight, to obtain the countermeasure parameter of each frame of reference game picture, wherein the countermeasure parameter characterizes the countermeasure intensity of each game countermeasure party in each frame of reference game picture; and clipping a target game video clip out of the original game video based on the countermeasure parameter of each frame of reference game picture.
The specific implementation of each operation above may be referred to the previous embodiments, and will not be described herein.
Wherein the storage medium may include: read-only memory (ROM), random access memory (RAM), magnetic or optical disk, and the like.
Because the instructions stored in the storage medium may perform the steps in any of the game video editing methods provided in the embodiments of the present application, the beneficial effects that any of the game video editing methods provided in the embodiments of the present application may be achieved, and detailed descriptions of the previous embodiments are omitted herein.
The foregoing has described in detail the game video editing method, apparatus, electronic device, and storage medium provided by the embodiments of the present application. Specific examples are applied herein to illustrate the principles and implementations of the present application, and the above description of the embodiments is only intended to help understand the method and core ideas of the present application. Meanwhile, those skilled in the art may make changes to the specific implementation and application scope in light of the ideas of the present application. In summary, the contents of this description should not be construed as limiting the present application.

Claims (11)

1. A method of video editing of a game, comprising:
acquiring an original game video, and extracting multi-frame reference game pictures from the original game video;
identifying, for each frame of the reference game picture, position information of a game character in a game scene in the reference game picture, and game opponent information to which the game character belongs;
calculating, for each frame of reference game picture, the countermeasure parameter of the reference game picture based on the number of game characters in the game scene and the distance between game characters when the game countermeasure parties fight, to obtain the countermeasure parameter of each frame of reference game picture, wherein the countermeasure parameter characterizes the countermeasure intensity of each game countermeasure party in each frame of reference game picture;
dividing the original game video into a plurality of video clips according to a preset time interval;
fusing the countermeasure parameters of the reference game pictures in each video segment to obtain fused countermeasure parameters of the video segments;
and determining the video segments with the fused countermeasure parameters larger than the preset value as target game video segments.
2. The method of claim 1, wherein the identifying, for each frame of the reference game screen, the position information of the game character in the game scene in the reference game screen, and the game opponent information to which the game character belongs, comprises:
Aiming at each frame of reference game picture, obtaining a preset life value indication template of a game role in a game scene;
identifying position information of a game character in a game scene in a reference game picture based on a sliding preset life value indication template;
and identifying the game opponent to which the game role belongs in the reference game picture to obtain the game opponent information to which the game role belongs in the reference game picture.
3. The method according to claim 2, wherein the identifying the game opponent to which the game character belongs in the reference game screen, to obtain the game opponent information to which the game character belongs in the reference game screen, includes:
acquiring color information of a life value indication template of a game character in the reference game picture;
and determining game opponent information to which the game character belongs in the reference game picture based on the color information of the life value indication template.
4. The method of claim 2, wherein the identifying the location information of the game character in the game scene in the reference game screen based on the sliding preset vital value indication template comprises:
Identifying the position of a life value indication template of a game role in a reference game picture based on the sliding of the preset life value indication template on the reference game picture;
and acquiring the position information of the game role in the game scene in the reference game picture based on the position of the life value indication template.
5. The method of claim 4, wherein the identifying the location of the vital value indicating template for the character in the reference game frame based on the sliding of the preset vital value indicating template on the reference game frame comprises:
calculating gray scale similarity of an overlapping area of the preset life value indication template and the reference game picture after sliding based on sliding of the preset life value indication template on the reference game picture;
based on the gray level similarity, determining a position of a life value indication template of a game character in the reference game screen.
6. The method of claim 5, wherein determining, based on the grayscale similarity, a location of a vital value indication template of a character in the reference game screen comprises:
generating a matching diagram corresponding to the reference game picture based on the gray level similarity, wherein the matching diagram comprises a plurality of pixel points, and the value of each pixel point represents the gray level similarity of a preset life value indication template in an overlapping area of the pixel point and the reference game picture;
Determining pixel points with gray level similarity higher than preset similarity in the matching graph as matching pixel points;
and determining the position of a life value indication template of the game role in the reference game picture based on the position of the matched pixel point in the matched picture corresponding to the reference game picture.
7. The method of claim 1, wherein the calculating, for each frame of reference game picture, the countermeasure parameter of the reference game picture based on the number of game characters in the game scene and the distance between game characters when the game countermeasure parties fight comprises:
for each frame of reference game picture, obtaining the distance between game characters when different game opponents fight;
calculating local countermeasure parameters of game characters in each game countermeasure party based on the distance;
and fusing the local countermeasure parameters of each game role in each game countermeasure party to obtain the countermeasure parameters of the reference game picture.
8. The method of claim 7, wherein calculating local challenge parameters for the game characters in each game challenge based on the distance comprises:
weighting, for each game character in each game countermeasure party, the distances from that game character to each game character in the other game countermeasure parties, to obtain the local countermeasure parameter of that game character, thereby obtaining the local countermeasure parameters of the game characters in each game countermeasure party.
9. A game video editing device, comprising:
the acquisition unit is used for acquiring an original game video and extracting multi-frame reference game pictures from the original game video;
an identification unit configured to identify, for each frame of the reference game screen, position information of a game character in a game scene in the reference game screen, and game opponent information to which the game character belongs;
a calculating unit, configured to calculate, for each frame of reference game picture, the countermeasure parameter of the reference game picture based on the number of game characters in the game scene and the distance between game characters when the game countermeasure parties fight, to obtain the countermeasure parameter of each frame of reference game picture, wherein the countermeasure parameter characterizes the countermeasure intensity of each game countermeasure party in each frame of reference game picture;
the clipping unit is used for dividing the original game video into a plurality of video clips according to a preset time interval; fusing the countermeasure parameters of the reference game pictures in each video clip to obtain the fused countermeasure parameter of the video clip; and determining the video clips whose fused countermeasure parameters are greater than a preset value as target game video clips.
10. An electronic device comprising a memory and a processor; the memory stores a plurality of instructions that are loaded by the processor to perform the steps in the game video clip method of any one of claims 1 to 8.
11. A storage medium having stored thereon a computer program which, when executed by a processor, performs the steps of performing the method of game video editing of any of claims 1 to 8.
CN202010302098.1A 2020-04-16 2020-04-16 Game video editing method and device Active CN111491179B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010302098.1A CN111491179B (en) 2020-04-16 2020-04-16 Game video editing method and device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010302098.1A CN111491179B (en) 2020-04-16 2020-04-16 Game video editing method and device

Publications (2)

Publication Number Publication Date
CN111491179A CN111491179A (en) 2020-08-04
CN111491179B true CN111491179B (en) 2023-07-14

Family

ID=71812832

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010302098.1A Active CN111491179B (en) 2020-04-16 2020-04-16 Game video editing method and device

Country Status (1)

Country Link
CN (1) CN111491179B (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112188268B (en) * 2020-09-25 2022-06-07 腾讯科技(深圳)有限公司 Virtual scene display method, virtual scene introduction video generation method and device
CN117499701B (en) * 2023-12-29 2024-03-12 景色智慧(北京)信息科技有限公司 Method and device for realizing riding game lens close-up and electronic equipment

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102145228B (en) * 2010-02-05 2015-04-29 Pc概念有限公司 Method and device for constructing interactive video game by using video records
CN110019951B (en) * 2017-09-29 2021-06-04 华为软件技术有限公司 Method and equipment for generating video thumbnail
CN109672922B (en) * 2017-10-17 2020-10-27 腾讯科技(深圳)有限公司 Game video editing method and device
CN108234825A (en) * 2018-01-12 2018-06-29 广州市百果园信息技术有限公司 Method for processing video frequency and computer storage media, terminal
CN108259990B (en) * 2018-01-26 2020-08-04 腾讯科技(深圳)有限公司 Video editing method and device

Also Published As

Publication number Publication date
CN111491179A (en) 2020-08-04

Similar Documents

Publication Publication Date Title
CN111491173B (en) Live cover determination method and device, computer equipment and storage medium
CN108829900B (en) Face image retrieval method and device based on deep learning and terminal
CN111013150B (en) Game video editing method, device, equipment and storage medium
US20140257995A1 (en) Method, device, and system for playing video advertisement
CN111744187B (en) Game data processing method and device, computer and readable storage medium
CN111491179B (en) Game video editing method and device
JP2010009425A (en) Image processor, image processing method, and computer program
CN112232258A (en) Information processing method and device and computer readable storage medium
CN112381104A (en) Image identification method and device, computer equipment and storage medium
CN112827168B (en) Target tracking method, device and storage medium
CN111643900A (en) Display picture control method and device, electronic equipment and storage medium
CN111405360B (en) Video processing method and device, electronic equipment and storage medium
CN111429476B (en) Method and device for determining action track of target person
CN111368751A (en) Image processing method, image processing device, storage medium and electronic equipment
CN113238972B (en) Image detection method, device, equipment and storage medium
CN112150349A (en) Image processing method and device, computer equipment and storage medium
CN112287848A (en) Live broadcast-based image processing method and device, electronic equipment and storage medium
CN114095742A (en) Video recommendation method and device, computer equipment and storage medium
CN111035933B (en) Abnormal game detection method and device, electronic equipment and readable storage medium
CN109939439B (en) Virtual character blocking detection method, model training method, device and equipment
CN112131426B (en) Game teaching video recommendation method and device, electronic equipment and storage medium
CN112150444A (en) Method and device for identifying attribute features of face image and electronic equipment
US20220005336A1 (en) Information processing system, information processing apparatus, and information processing method
CN113497946A (en) Video processing method and device, electronic equipment and storage medium
CN116152932A (en) Living body detection method and related equipment

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant