CN110694264A - Game video generation method, terminal and readable storage medium

Game video generation method, terminal and readable storage medium

Info

Publication number
CN110694264A
CN110694264A
Authority
CN
China
Prior art keywords
game
video
video file
information
generating
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201910966756.4A
Other languages
Chinese (zh)
Inventor
廖松茂
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Nubia Technology Co Ltd
Nanchang Nubia Technology Co Ltd
Original Assignee
Nubia Technology Co Ltd
Nanchang Nubia Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Nubia Technology Co Ltd, Nanchang Nubia Technology Co Ltd filed Critical Nubia Technology Co Ltd
Priority to CN201910966756.4A
Publication of CN110694264A
Legal status: Pending

Classifications

    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00 Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/30 Interconnection arrangements between game servers and game devices; Interconnection arrangements between game devices; Interconnection arrangements between game servers
    • A63F13/35 Details of game servers
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00 Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/50 Controlling the output signals based on the game progress
    • A63F13/52 Controlling the output signals based on the game progress involving aspects of the displayed game scene
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40 Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43 Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/433 Content storage operation, e.g. storage operation in response to a pause request, caching operations
    • H04N21/4334 Recording operations
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40 Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/47 End-user applications
    • H04N21/478 Supplemental services, e.g. displaying phone caller identification, shopping application
    • H04N21/4781 Games

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Telephone Function (AREA)

Abstract

The embodiment of the invention discloses a method for generating a game video, which comprises the following steps: if an instruction for generating a game video is detected, searching a corresponding first video file according to the instruction; if the first video file exists, analyzing the identification information of the first video file; judging whether the identification information is matched with the current game running state; and if so, generating a second video file according to the first video file and the current game running state. The embodiment of the invention also provides a game video generation terminal and a readable storage medium. The embodiment of the invention intelligently recognizes the game state and continues recording an unfinished game video after the game application is restarted, which ensures the integrity of the game video and improves the game experience of the user.

Description

Game video generation method, terminal and readable storage medium
Technical Field
The embodiment of the invention relates to the technical field of terminals, in particular to a game video generation method, a terminal and a readable storage medium.
Background
At present, mobile terminals are increasingly used for games. The video recording function built into existing game applications stores game highlight footage on the server side, and when a game match ends the user can choose to save the highlight video. However, after the mobile terminal or the game application is restarted, the related game service is suspended and the highlight video on the server side is automatically deleted; even if the user logs back into the game and starts the recording function again, the complete game video cannot be obtained when the match ends. As a result, a large number of the user's highlight operations go unrecorded, which seriously affects the user's game experience.
Disclosure of Invention
The embodiment of the invention mainly aims to provide a game video generation method, a terminal and a readable storage medium, so as to solve the technical problem that an existing game video cannot be recorded continuously after being interrupted, which results in a poor user experience.
In order to achieve the above object, an embodiment of the present invention provides a method for generating a game video, including:
if an instruction for generating a game video is detected, searching a corresponding first video file according to the instruction;
if the first video file exists, analyzing the identification information of the first video file;
judging whether the identification information is matched with the current game running state;
and if so, generating a second video file according to the first video file and the current game running state.
Optionally, if an instruction for generating a game video is detected, before searching for a corresponding first video file according to the instruction, the method further includes:
reading recorded video information on a server;
storing the recorded video information to the local;
and generating a first video file according to the recorded video information.
Optionally, if an instruction for generating a game video is detected, before searching for a corresponding first video file according to the instruction, the method further includes:
detecting user operation or application scene information in the game process;
when the user operation or application scene information meets a preset condition, executing a screen recording instruction to generate recorded video information;
and generating a first video file according to the recorded video information.
Optionally, if an instruction for generating a game video is detected, the step of searching for the corresponding first video file according to the instruction includes:
monitoring the progress of the game application and an operation instruction of a user, and judging whether a game recording function is started or not;
if the game recording function is started, searching a corresponding first video file according to the name of the game video file or other associated identifiers.
Optionally, the step of parsing the identification information of the first video file includes:
reading identification information of the first video file, wherein the identification information comprises time node information and interface information of a game application;
analyzing time node information recorded in the identification information of the first video file, wherein the time node information comprises a start time and an end time;
or analyzing interface information of the game application, wherein the interface information comprises a starting interface, an ending interface and an interface meeting preset conditions.
Optionally, the step of determining whether the identification information matches the current game running state includes:
acquiring and comparing the time node information with the time information of the current game operation;
and if the time node information and the time information of the current game operation are within a preset time interval range, determining that the first video file is matched with the current game operation state.
Optionally, the step of determining whether the identification information matches the current game running state includes:
acquiring and comparing the interface information with a current game running interface, wherein the interface information comprises character elements, graphic elements and color elements;
and if the similarity of the two elements meets a preset condition, determining that the first video file is matched with the current game running state.
Optionally, if the first video file and the second video file are matched, the step of generating the second video file according to the first video file and the current game running state includes:
acquiring attribute information of a first video file;
generating a third video file according to the attribute information and the current game running state;
synthesizing the first video file and the third video file according to a preset sequence to generate a second video file;
and generating new identification information according to the second video file.
In addition, to achieve the above object, a second aspect of the embodiments of the present invention provides a mobile terminal, where the mobile terminal includes: a memory, a processor and a game video generation program stored on the memory and executable on the processor, the game video generation program when executed by the processor implementing the steps of the game video generation method as described above.
In addition, to achieve the above object, a third aspect of the embodiments of the present invention provides a readable storage medium, in which a game video generation program is stored, and when executed by a processor, the game video generation program implements the steps of the game video generation method described above.
The embodiment of the invention provides a game video generation method, a terminal and a readable storage medium, wherein if an instruction for generating a game video is detected, a corresponding first video file is searched according to the instruction; if the first video file exists, the identification information of the first video file is analyzed; whether the identification information matches the current game running state is judged; and if so, a second video file is generated according to the first video file and the current game running state. According to the embodiment of the invention, after the mobile terminal or the application program is restarted, the previously recorded game video can be loaded and recording can be continued, which ensures the integrity of the game video and improves the game experience of the user.
Drawings
Fig. 1 is a schematic diagram of a hardware structure of an alternative mobile terminal according to an embodiment of the present invention;
FIG. 2 is a diagram of a wireless communication device of the mobile terminal of FIG. 1;
fig. 3 is a flowchart of a method of generating a game video according to a first embodiment of the present invention.
Detailed Description
It should be understood that the specific embodiments described herein are merely illustrative of the embodiments of the invention and are not limiting of the embodiments of the invention.
In the following description, suffixes such as "module", "component", or "unit" used to denote elements are used only for facilitating the description of the embodiments of the present invention, and have no specific meaning in themselves. Thus, "module", "component" and "unit" may be used interchangeably.
The terminal may be implemented in various forms. For example, the terminal described in the embodiment of the present invention may include terminals such as a mobile phone, a tablet computer, a notebook computer, a palmtop computer, a Personal Digital Assistant (PDA), a Portable Media Player (PMP), a navigation device, a wearable device, a smart band, a pedometer, and the like, and fixed terminals such as a Digital TV, a desktop computer, and the like.
While a mobile terminal will be described as an example in the following description, those skilled in the art will appreciate that the configurations according to the embodiments of the present invention can also be applied to fixed terminals, except for elements used specifically for mobile purposes.
Referring to fig. 1, which is a schematic diagram of a hardware structure of a mobile terminal for implementing various embodiments of the present invention, the mobile terminal 100 may include: RF (Radio Frequency) unit 101, WiFi module 102, audio output unit 103, a/V (audio/video) input unit 104, sensor 105, display unit 106, user input unit 107, interface unit 108, memory 109, processor 110, and power supply 111. Those skilled in the art will appreciate that the mobile terminal architecture shown in fig. 1 is not intended to be limiting of mobile terminals, which may include more or fewer components than those shown, or some components may be combined, or a different arrangement of components.
The following describes each component of the mobile terminal in detail with reference to fig. 1:
The radio frequency unit 101 may be configured to receive and transmit signals during information transmission and reception or during a call; specifically, it receives downlink information from a base station and delivers it to the processor 110 for processing, and transmits uplink data to the base station. Typically, the radio frequency unit 101 includes, but is not limited to, an antenna, at least one amplifier, a transceiver, a coupler, a low noise amplifier, a duplexer, and the like. In addition, the radio frequency unit 101 can also communicate with a network and other devices through wireless communication. The wireless communication may use any communication standard or protocol, including but not limited to GSM (Global System for Mobile communications), GPRS (General Packet Radio Service), CDMA2000 (Code Division Multiple Access 2000), WCDMA (Wideband Code Division Multiple Access), TD-SCDMA (Time Division-Synchronous Code Division Multiple Access), FDD-LTE (Frequency Division Duplex-Long Term Evolution), and TDD-LTE (Time Division Duplex-Long Term Evolution).
WiFi is a short-range wireless transmission technology. Through the WiFi module 102, the mobile terminal can help the user receive and send e-mails, browse web pages, access streaming media and the like, providing wireless broadband Internet access for the user. Although fig. 1 shows the WiFi module 102, it is understood that the WiFi module 102 is not an essential component of the mobile terminal and may be omitted as needed without changing the essence of the invention.
The audio output unit 103 may convert audio data received by the radio frequency unit 101 or the WiFi module 102 or stored in the memory 109 into an audio signal and output as sound when the mobile terminal 100 is in a call signal reception mode, a call mode, a recording mode, a voice recognition mode, a broadcast reception mode, or the like. Also, the audio output unit 103 may also provide audio output related to a specific function performed by the mobile terminal 100 (e.g., a call signal reception sound, a message reception sound, etc.). The audio output unit 103 may include a speaker, a buzzer, and the like.
The a/V input unit 104 is used to receive audio or video signals. The a/V input unit 104 may include a Graphics Processing Unit (GPU) 1041 and a microphone 1042. The graphics processor 1041 processes image data of still pictures or video obtained by an image capturing device (e.g., a camera) in a video capturing mode or an image capturing mode. The processed image frames may be displayed on the display unit 106. The image frames processed by the graphics processor 1041 may be stored in the memory 109 (or other storage medium) or transmitted via the radio frequency unit 101 or the WiFi module 102. The microphone 1042 may receive sounds (audio data) in a phone call mode, a recording mode, a voice recognition mode, or the like, and can process such sounds into audio data. In the phone call mode, the processed audio (voice) data may be converted into a format transmittable to a mobile communication base station via the radio frequency unit 101 for output. The microphone 1042 may implement various types of noise cancellation (or suppression) algorithms to cancel (or suppress) noise or interference generated in the course of receiving and transmitting audio signals.
The mobile terminal 100 also includes at least one sensor 105, such as a light sensor, a motion sensor, and other sensors. Specifically, the light sensor includes an ambient light sensor that can adjust the brightness of the display panel 1061 according to the brightness of ambient light, and a proximity sensor that can turn off the display panel 1061 and/or a backlight when the mobile terminal 100 is moved to the ear. As one of the motion sensors, the accelerometer sensor can detect the magnitude of acceleration in each direction (generally, three axes), can detect the magnitude and direction of gravity when stationary, and can be used for applications of recognizing the posture of a mobile phone (such as horizontal and vertical screen switching, related games, magnetometer posture calibration), vibration recognition related functions (such as pedometer and tapping), and the like; as for other sensors such as a fingerprint sensor, a pressure sensor, an iris sensor, a molecular sensor, a gyroscope, a barometer, a hygrometer, a thermometer, and an infrared sensor, which can be configured on the mobile phone, further description is omitted here.
The display unit 106 is used to display information input by a user or information provided to the user. The Display unit 106 may include a Display panel 1061, and the Display panel 1061 may be configured in the form of a Liquid Crystal Display (LCD), an Organic Light-Emitting Diode (OLED), or the like.
The user input unit 107 may be used to receive input numeric or character information and generate key signal inputs related to user settings and function control of the mobile terminal. Specifically, the user input unit 107 may include a touch panel 1071 and other input devices 1072. The touch panel 1071, also referred to as a touch screen, may collect a touch operation performed by a user on or near the touch panel 1071 (e.g., an operation performed by the user on or near the touch panel 1071 using a finger, a stylus, or any other suitable object or accessory), and drive a corresponding connection device according to a predetermined program. The touch panel 1071 may include two parts of a touch detection device and a touch controller. The touch detection device detects the touch direction of a user, detects a signal brought by touch operation and transmits the signal to the touch controller; the touch controller receives touch information from the touch sensing device, converts the touch information into touch point coordinates, sends the touch point coordinates to the processor 110, and can receive and execute commands sent by the processor 110. In addition, the touch panel 1071 may be implemented in various types, such as a resistive type, a capacitive type, an infrared ray, and a surface acoustic wave. In addition to the touch panel 1071, the user input unit 107 may include other input devices 1072. In particular, other input devices 1072 may include, but are not limited to, one or more of a physical keyboard, function keys (e.g., volume control keys, switch keys, etc.), a trackball, a mouse, a joystick, and the like, and are not limited to these specific examples.
Further, the touch panel 1071 may cover the display panel 1061, and when the touch panel 1071 detects a touch operation thereon or nearby, the touch panel 1071 transmits the touch operation to the processor 110 to determine the type of the touch event, and then the processor 110 provides a corresponding visual output on the display panel 1061 according to the type of the touch event. Although the touch panel 1071 and the display panel 1061 are shown in fig. 1 as two separate components to implement the input and output functions of the mobile terminal, in some embodiments, the touch panel 1071 and the display panel 1061 may be integrated to implement the input and output functions of the mobile terminal, and is not limited herein.
The interface unit 108 serves as an interface through which at least one external device is connected to the mobile terminal 100. For example, the external device may include a wired or wireless headset port, an external power supply (or battery charger) port, a wired or wireless data port, a memory card port, a port for connecting a device having an identification module, an audio input/output (I/O) port, a video I/O port, an earphone port, and the like. The interface unit 108 may be used to receive input (e.g., data information, power, etc.) from external devices and transmit the received input to one or more elements within the mobile terminal 100 or may be used to transmit data between the mobile terminal 100 and external devices.
The memory 109 may be used to store a software program and various data, and the memory 109 may be a computer storage medium, and the memory 109 stores a game video generation program according to an embodiment of the present invention. The memory 109 may mainly include a storage program area and a storage data area, wherein the storage program area may store an operating system, an application program required by at least one function (such as a sound playing function, an image playing function, etc.), and the like; the storage data area may store data (such as audio data, a phonebook, etc.) created according to the use of the cellular phone, and the like. Further, the memory 109 may include high speed random access memory, and may also include non-volatile memory, such as at least one magnetic disk storage device, flash memory device, or other volatile solid state storage device.
The processor 110 is a control center of the mobile terminal, connects various parts of the entire mobile terminal using various interfaces and lines, and performs various functions of the mobile terminal and processes data by operating or executing software programs and/or modules stored in the memory 109 and calling data stored in the memory 109, thereby performing overall monitoring of the mobile terminal. For example, the processor 110 executes the game video generation program in the memory 109 to implement the steps of the embodiments of the game video generation method according to the embodiments of the present invention.
The processor 110 may include one or more processing units; optionally, the processor 110 may integrate an application processor, which mainly handles the operating system, user interface, application programs and the like, and a modem processor, which mainly handles wireless communications. It will be appreciated that the modem processor described above may also not be integrated into the processor 110.
The mobile terminal 100 may further include a power supply 111 (e.g., a battery) for supplying power to various components, and optionally, the power supply 111 may be logically connected to the processor 110 through a power management system, so as to implement functions of managing charging, discharging, and power consumption through the power management system.
Although not shown in fig. 1, the mobile terminal 100 may further include a bluetooth module or the like, which is not described in detail herein.
In order to facilitate understanding of the embodiment of the present invention, a communication network system on which the mobile terminal according to the embodiment of the present invention is based is described below.
Referring to fig. 2, fig. 2 is an architecture diagram of a communication Network system according to an embodiment of the present invention, where the communication Network system is an LTE system of a universal mobile telecommunications technology, and the LTE system includes a UE (User Equipment) 201, an E-UTRAN (Evolved UMTS Terrestrial Radio Access Network) 202, an EPC (Evolved Packet Core) 203, and an IP service 204 of an operator, which are in communication connection in sequence.
Specifically, the UE201 may be the terminal 100 described above, and is not described herein again.
The E-UTRAN202 includes eNodeB2021 and other eNodeBs 2022, among others. Among them, the eNodeB2021 may be connected with other eNodeB2022 through backhaul (e.g., X2 interface), the eNodeB2021 is connected to the EPC203, and the eNodeB2021 may provide the UE201 access to the EPC 203.
The EPC 203 may include an MME (Mobility Management Entity) 2031, an HSS (Home Subscriber Server) 2032, other MMEs 2033, an SGW (Serving Gateway) 2034, a PGW (PDN Gateway) 2035, a PCRF (Policy and Charging Rules Function) 2036, and the like. The MME 2031 is a control node that handles signaling between the UE 201 and the EPC 203, and provides bearer and connection management. The HSS 2032 provides registers, such as a home location register (not shown), to manage functions and holds subscriber-specific information about service characteristics, data rates, and the like. All user data may be sent through the SGW 2034; the PGW 2035 may provide IP address assignment and other functions for the UE 201; and the PCRF 2036 is the policy and charging control decision point for service data flows and IP bearer resources, which selects and provides available policy and charging control decisions for a policy and charging enforcement function (not shown).
The IP services 204 may include the internet, intranets, IMS (IP Multimedia Subsystem), or other IP services, among others.
Although the LTE system is described as an example, it should be understood by those skilled in the art that the embodiments of the present invention are not limited to the LTE system, but may also be applied to other wireless communication systems, such as GSM, CDMA2000, WCDMA, TD-SCDMA, and future new network systems.
Based on the above terminal hardware structure and communication network system, various embodiments of the method of the present invention are provided below.
As shown in the terminal structure of fig. 1, the processor 110 may call the generation program of the game video stored in the memory 109, and execute the generation method of the game video provided by the embodiment of the present invention.
The embodiment of the invention further provides a game video generation method. Referring to fig. 3, fig. 3 is a flowchart of a method of generating a game video according to a first embodiment of the present invention.
The invention provides a game video generation method, which is characterized in that when a mobile terminal or a game application starts a game video recording function, the recorded game video is stored locally, when an instruction for generating the game video is detected, a corresponding first video file is searched according to the instruction, identification information of the first video file is analyzed, whether the identification information is matched with the current game running state or not is judged, and if the identification information is matched with the current game running state, a second video file is generated according to the first video file and the current game running state.
Specifically, in this embodiment, the method for generating a game video includes:
step S300, if an instruction for generating a game video is detected, searching a corresponding first video file according to the instruction;
in this embodiment, the game video refers to a screen recording recorded by a game application or a terminal device and recording a highlight in the game. The wonderful moment is the moment when a game player performs wonderful operation, and can be mainly expressed as the hitting and killing or attack assisting actions in shooting and fighting games, the overtaking or drifting actions in racing games and other high-difficulty actions. It should be noted that the game video in the embodiment of the present application is not limited to recording a specific highlight, and may also be recording the whole game process. The instruction for generating the game video can be that the game application is automatically started according to a preset condition or that the game application is automatically started by the user. And judging whether the game recording function is started or not by monitoring the progress of the game application program and the operation instruction of the user. Monitoring the progress of the game application and the user's actions may be accomplished by embedding points in the progress of the game application.
Specifically, when it is monitored that the game picture recording function has been started, the terminal device searches the files in its memory to find whether a game video file associated with the corresponding game application exists. If no related game video file exists in the memory, the terminal device creates a folder corresponding to the game application and stores the recording in it after the game video recording is finished. The folder can be named after the game, or a special association identifier can be generated according to a preset condition. If the game recording function is started, the corresponding first video file is searched for according to the name of the game video file or other associated identifiers. The game video file name may include the full name or abbreviation of the game, and other associated identifiers may be a game application ID number, a recording time, a file serial number, or the like. Specifically, when an instruction for generating a game video is detected, the corresponding folder is searched according to the video file storage path, where the corresponding folder may be a folder having the same name as the game application or carrying the special association identifier. The video files in the corresponding folder are then screened to find the corresponding first video file. In addition, the first video file is a game video file that is obtained by recording the screen of a specific game application and satisfies a preset time condition. The preset time condition means that, according to the game characteristics of the specific game application, the total recording duration of the existing game video file is below a preset threshold, and the interval between the recording end time of the video file and the start time of the current game video recording does not exceed a preset threshold. For example, in Honor of Kings, a game match generally lasts no more than one hour; if more than one hour has passed between the end time of the existing game video recording and the time when the current game video recording function is started, the preset time condition is not met. By standardizing the naming of the generated game video files, whether an unfinished first video file exists can be determined more quickly and accurately.
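As a minimal Kotlin sketch of how such a lookup might be implemented on the terminal side (the folder layout, the file-name filter and both thresholds are illustrative assumptions rather than anything specified by the patent):

import android.media.MediaMetadataRetriever
import java.io.File

// Illustrative configuration; the patent only requires "preset" thresholds.
data class LookupConfig(
    val videoRoot: File,          // local folder holding one sub-folder of recordings per game
    val maxRecordingMillis: Long, // upper bound on the expected duration of one match
    val maxGapMillis: Long        // allowed gap between the old recording end and the new recording start
)

// Searches the folder named after the game (or carrying its association
// identifier) for an unfinished first video file that satisfies the
// preset time condition described above.
fun findFirstVideoFile(gameName: String, nowMillis: Long, cfg: LookupConfig): File? {
    val gameFolder = File(cfg.videoRoot, gameName)
    if (!gameFolder.isDirectory) return null                     // no folder yet: nothing to resume
    return gameFolder.listFiles { f -> f.extension == "mp4" }
        ?.filter { f ->
            recordedDurationOf(f) < cfg.maxRecordingMillis &&    // recording clearly shorter than a full match
                nowMillis - f.lastModified() < cfg.maxGapMillis  // and interrupted only recently
        }
        ?.maxByOrNull { it.lastModified() }                      // most recent candidate wins
}

// Reads the recorded duration from the file itself; it could equally be
// taken from the identification information described in the next step.
fun recordedDurationOf(file: File): Long {
    val retriever = MediaMetadataRetriever()
    return try {
        retriever.setDataSource(file.absolutePath)
        retriever.extractMetadata(MediaMetadataRetriever.METADATA_KEY_DURATION)?.toLong() ?: 0L
    } finally {
        retriever.release()
    }
}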
Step S301, if the first video file exists, analyzing the identification information of the first video file;
in this embodiment, the first video file may directly include the video information and the identification information, or may generate an independent identification information file corresponding to each game video file. After the first video file is determined, reading the identification information of the first video file, wherein the identification information comprises time node information and interface information of a game application, and can also comprise information of abnormal interruption of game video recording.
Analyzing time node information recorded in the identification information of the first video file, wherein the time node information comprises the starting time and the ending time of game video recording or comprises game video recording duration information.
Or, interface information of the game application recorded in the identification information of the first video file is analyzed, wherein the interface information comprises a starting interface, an ending interface and an interface meeting preset conditions. Specifically, the starting interface of the game may be the preparation interface shown while a game match is loading, describing elements such as the characters and scenes of the specific game. The game ending interface may be the settlement interface of the game match, recording elements such as the win/loss result and rewards of the specific game. The interface meeting the preset condition may be an interface shown when the game progresses to a specific stage or when a specific element appears, for example the interface when a specific level is entered or the interface when a kill or assist prompt appears, and is set specifically according to different game applications. By analyzing the identification information of the first video file, the efficiency of comparison between the first video file and the current game running state is improved.
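A minimal Kotlin sketch of how the identification information could be modelled and parsed; the field names and the JSON side-car layout are assumptions made for this example, since the patent only requires that time node information, interface information and, optionally, abnormal-interruption information be recorded:

import org.json.JSONObject
import java.io.File

// Illustrative model of the identification information described above.
data class TimeNodeInfo(val startMillis: Long, val endMillis: Long) {
    val durationMillis: Long get() = endMillis - startMillis
}

data class InterfaceInfo(
    val startScreenshot: File?,        // game-loading (start) interface
    val endScreenshot: File?,          // settlement (end) interface
    val eventScreenshots: List<File>   // interfaces meeting preset conditions (kill/assist prompts, specific levels)
)

data class IdentificationInfo(
    val timeNodes: TimeNodeInfo,
    val interfaces: InterfaceInfo,
    val abnormalInterruption: Boolean  // whether the recording was interrupted abnormally
)

// Assumes the identification information is kept in a JSON side-car file
// next to the video ("<video>.meta.json"); the patent equally allows it to
// be embedded directly in the first video file.
fun parseIdentificationInfo(videoFile: File): IdentificationInfo {
    val meta = JSONObject(File(videoFile.path + ".meta.json").readText())
    fun screenshot(key: String): File? =
        meta.optString(key).takeIf { it.isNotEmpty() }?.let { File(it) }
    return IdentificationInfo(
        timeNodes = TimeNodeInfo(meta.getLong("startMillis"), meta.getLong("endMillis")),
        interfaces = InterfaceInfo(
            startScreenshot = screenshot("startInterface"),
            endScreenshot = screenshot("endInterface"),
            eventScreenshots = meta.optJSONArray("eventInterfaces")
                ?.let { arr -> (0 until arr.length()).map { File(arr.getString(it)) } }
                ?: emptyList()
        ),
        abnormalInterruption = meta.optBoolean("abnormalInterruption", false)
    )
}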
Step S302, judging whether the identification information is matched with the current game running state;
in this embodiment, the game running status includes, but is not limited to, the game-play is on, the game-play continues, and the game-play is completely finished. The current game running state refers to a running state of game play when the game recording service is started, and includes but is not limited to starting of new game play, continuing of game play, complete ending of game play and the like. And judging whether the first video file is matched with the current game running state according to the content recorded by the identification information, wherein the matching does not mean that the game running state recorded by the first video file is completely the same as the current game running state, but means that whether the game play of the specific game recorded by the first video file is not finished. Specifically, the time node information and the time information of the current game operation are obtained and compared, or the interface information and the operation interface of the current game are obtained. And then, comparing according to preset conditions, and judging whether the specific game match recorded in the first video file is the same as the game match in the current game.
If the time node information and the current game running time information are within a preset time interval range, it is determined that the first video file matches the current game running state, that is, the game match recorded in the first video file and the current game are determined to be the same match. For example, if one game match needs at least thirty minutes, but the recording duration in the time node information is far less than thirty minutes, and the interval between the end time in the time node information and the start time of the current game run is less than a preset value, the match is very likely not to have ended, and the match described in the first video file very likely belongs to the same match as the current game.
Alternatively, the interface information is obtained and compared with the current game running interface, where the interface information comprises character elements, graphic elements and color elements. The interface information and the current game running interface can be obtained through instrumentation points embedded in the game application process, by taking an interface screenshot of the game application when the related game service function is started, or directly from the recorded game video. If the similarity between the two meets a preset condition, it is determined that the first video file matches the current game running state. For example, the game application Honor of Kings reloads an unfinished match after the application is restarted, and the same match-loading interface as at the beginning of the match appears at that moment; if this interface information is the same as the interface currently being loaded by the game, the identification information is judged to match the current game running state. It should be noted that the similarity meeting the preset condition may mean that the proportion of matching elements reaches a certain value, for example seventy percent, and this proportion may be adjusted for different game applications.
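Combining the two matching strategies above might look roughly as follows in Kotlin, reusing the IdentificationInfo model from the earlier parsing sketch; the interval value, the 70% similarity threshold and the compareInterfaces helper are illustrative stand-ins for the "preset" values and for the character/graphic/color comparison described in the text:

import java.io.File

// Illustrative thresholds; the patent only speaks of "preset" values.
const val MAX_RESUME_GAP_MILLIS = 5 * 60 * 1000L  // allowed gap between old end time and current start time
const val SIMILARITY_THRESHOLD = 0.70             // "proportion of matching elements reaches seventy percent"

// Time-node matching: the old recording is clearly shorter than a full match
// and the interruption happened shortly before the current game started.
fun matchesByTimeNodes(info: IdentificationInfo, minFullMatchMillis: Long, currentStartMillis: Long): Boolean =
    info.timeNodes.durationMillis < minFullMatchMillis &&
        currentStartMillis - info.timeNodes.endMillis < MAX_RESUME_GAP_MILLIS

// Interface matching: compare the recorded loading interface against the one
// currently on screen; compareInterfaces stands in for any character/graphic/
// color comparison (e.g. OCR on text elements plus a color-histogram distance).
fun matchesByInterface(info: IdentificationInfo, currentLoadingScreenshot: File): Boolean {
    val reference = info.interfaces.startScreenshot ?: return false
    return compareInterfaces(reference, currentLoadingScreenshot) >= SIMILARITY_THRESHOLD
}

// Either criterion, or both combined, can establish the match, as described above.
fun matchesCurrentGameState(
    info: IdentificationInfo,
    minFullMatchMillis: Long,
    currentStartMillis: Long,
    currentLoadingScreenshot: File?
): Boolean =
    matchesByTimeNodes(info, minFullMatchMillis, currentStartMillis) ||
        (currentLoadingScreenshot != null && matchesByInterface(info, currentLoadingScreenshot))

// Hypothetical similarity function returning a value in [0.0, 1.0].
fun compareInterfaces(reference: File, current: File): Double = 0.0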
In other embodiments, some game applications automatically prompt the user whether to load the previous match; when this text information is obtained, it may be determined that the identification information matches the current game running state. Alternatively, the game interface information includes scene information of the game, which may specifically include a game login scene, a game loading scene, a game store scene, a game battle scene, and the like, without limitation here. It can be understood that each game scene corresponds to a different game scene ID, that is, there is a unique correspondence between a game scene ID and a game scene. Therefore, in the embodiment of the present invention, the game scene ID at the moment the electronic device displays the game picture may be obtained, the game scene corresponding to the game picture may be derived from the game scene ID, and the specific game running state may be determined accordingly. Judging by scene ID reduces the background processing load, makes the comparison faster, and covers a wider range of cases.
It should be noted that the game match in the embodiment of the present invention is not limited to the form described above; game running states presented in other manners are also covered, for example a game that is not divided into multiple individual competitive matches but proceeds continuously in a linear or non-linear manner. Whether the identification information matches the current game running state may be judged according to the time node information, according to the interface information, or by combining the time node information and the interface information, which improves the accuracy of the judgment.
And step S303, if the first video file is matched with the current game running state, generating a second video file according to the first video file and the current game running state.
In the present embodiment, attribute information of the first video file is acquired, the attribute information including the end time and the end picture of the first video file. The current game running state is acquired, and a third video file is generated according to the attribute information and the current game running state. After the third video file is generated, the first video file and the third video file are synthesized in a preset order to generate the second video file. The second video file is the target game video to be generated by the embodiment of the present invention. After the target game video is generated, the first video file, the third video file and their identification information can be retained or deleted directly; new identification information is generated according to the information of the second video file, or the identification information of the first video file is modified and supplemented.
It should be noted that the third video file is also generated by recording the game picture; it may be recorded as a direct continuation of the first video file, or it may be an independent video file separate from the first video file. If the third video file is obtained by continuing to record on the basis of the first video file, it is retained or deleted directly after the target game video is generated, and the original identification information is modified or new identification information is generated.
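A Kotlin sketch of the orchestration in step S303, assuming the JSON side-car layout from the earlier parsing sketch; concatenateMp4 is a hypothetical placeholder for whatever concatenation backend is used (for example a MediaExtractor/MediaMuxer remux), which the patent does not prescribe:

import java.io.File

// The continued recording (third video file) is appended to the interrupted
// recording (first video file) in the preset order, producing the complete
// second video file, and new identification information is written for it.
fun generateSecondVideoFile(
    firstVideo: File,
    thirdVideo: File,
    startMillis: Long,           // start time taken from the first video's attribute information
    endMillis: Long,             // end time of the continued recording
    keepSources: Boolean = false // the patent allows retaining or deleting the partial files
): File {
    val secondVideo = File(firstVideo.parentFile, firstVideo.nameWithoutExtension + "_full.mp4")
    concatenateMp4(listOf(firstVideo, thirdVideo), secondVideo)

    // New identification information for the merged recording.
    File(secondVideo.path + ".meta.json").writeText(
        """{"startMillis": $startMillis, "endMillis": $endMillis, "abnormalInterruption": false}"""
    )

    if (!keepSources) {
        File(firstVideo.path + ".meta.json").delete()
        firstVideo.delete()
        thirdVideo.delete()
    }
    return secondVideo
}

// Placeholder for the concatenation backend, which is not specified by the patent.
fun concatenateMp4(inputs: List<File>, output: File): Unit =
    TODO("remux the inputs into the output with MediaExtractor/MediaMuxer or an equivalent tool")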
In the method for generating the game video provided by the embodiment of the invention, if an instruction for generating the game video is detected, a corresponding first video file is searched according to the instruction; if the first video file exists, the identification information of the first video file is analyzed; whether the identification information matches the current game running state is judged; and if so, a second video file is generated according to the first video file and the current game running state. The embodiment of the invention makes it possible, after the mobile terminal or the application program is restarted, to load the previously recorded game video and continue recording from it, thereby preventing the game video recording from being left incomplete because the mobile terminal or the application program was restarted. Meanwhile, the game video is stored locally, which effectively improves the safety of the game video file; comparing the time node information and the interface information improves the accuracy of continued recording, guarantees the integrity of the game video to the greatest extent, and improves the game experience of the user.
Based on the first embodiment, a second embodiment of the game video generation method of the present invention is provided, in this embodiment, before step S300, the method further includes:
the generation of the first video file may be reading recorded video information on a server, storing the recorded video information to a local, and then generating the first video file. Or, the generation of the first video file may be that in the game process, user operation or application scene information is detected, when the user operation or application scene information meets a preset condition, a screen recording instruction is executed to generate recorded video information, and then the first video file is generated according to the recorded video information.
In addition, an embodiment of the present invention further provides a terminal, where the terminal includes: the game video generation method comprises a memory, a processor and a game video generation program which is stored on the memory and can run on the processor, wherein the steps of the embodiments of the game video generation method are realized when the game video generation program is executed by the processor.
In addition, an embodiment of the present invention further provides a computer-readable storage medium, where one or more programs are stored, and the one or more programs are further executable by one or more processors to implement the steps of the embodiments of the game video generation method.
The specific implementation of the terminal and the readable storage medium (i.e., the computer readable storage medium) in the embodiments of the present invention corresponds to the steps in the embodiments of the game video generation method, and details are not repeated here.
It should be noted that, in this document, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or system that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or system. Without further limitation, an element defined by the phrase "comprising an … …" does not exclude the presence of other like elements in a process, method, article, or system that comprises the element.
The above-mentioned serial numbers of the embodiments of the present invention are merely for description and do not represent the merits of the embodiments.
Through the above description of the embodiments, those skilled in the art will clearly understand that the method of the above embodiments can be implemented by software plus a necessary general hardware platform, and certainly can also be implemented by hardware, but in many cases, the former is a better implementation manner. Based on such understanding, the technical solutions of the embodiments of the present invention may be essentially or partially implemented in the form of a software product, which is stored in a computer-readable storage medium (e.g., ROM/RAM, magnetic disk, optical disk) as described above and includes several instructions for enabling a terminal device (e.g., a mobile phone, a computer, a server, an air conditioner, or a network device) to execute the methods according to the embodiments of the present invention.
The above description is only a preferred embodiment of the present invention, and not intended to limit the scope of the present invention, and all modifications of equivalent structures and equivalent processes performed by the embodiments of the present invention and the contents of the accompanying drawings, or directly or indirectly applied to other related technical fields, are included in the scope of the present invention.

Claims (10)

1. A method for generating a game video, comprising the steps of:
if an instruction for generating a game video is detected, searching a corresponding first video file according to the instruction;
if the first video file exists, analyzing the identification information of the first video file;
judging whether the identification information is matched with the current game running state;
and if so, generating a second video file according to the first video file and the current game running state.
2. The method as claimed in claim 1, wherein before searching for the corresponding first video file according to the instruction if the instruction for generating the game video is detected, the method further comprises:
reading recorded video information on a server;
storing the recorded video information to the local;
and generating a first video file according to the recorded video information.
3. The method as claimed in claim 1, wherein before searching for the corresponding first video file according to the instruction if the instruction for generating the game video is detected, the method further comprises:
detecting user operation or application scene information in the game process;
when the user operation or application scene information meets a preset condition, executing a screen recording instruction to generate recorded video information;
and generating a first video file according to the recorded video information.
4. The method as claimed in claim 1, wherein the step of searching for the corresponding first video file according to the instruction if the instruction for generating the game video is detected comprises:
monitoring the progress of the game application and an operation instruction of a user, and judging whether a game recording function is started or not;
if the game recording function is started, searching a corresponding first video file according to the name of the game video file or other associated identifiers.
5. A game video generation method according to any one of claims 1 to 4, wherein the step of parsing the identification information of the first video file includes:
reading identification information of the first video file, wherein the identification information comprises time node information and interface information of a game application;
analyzing time node information recorded in the identification information of the first video file, wherein the time node information comprises a start time and an end time;
or analyzing interface information of the game application, wherein the interface information comprises a starting interface, an ending interface and an interface meeting preset conditions.
6. The game video generation method of claim 5, wherein the step of determining whether the identification information matches a current game running state comprises:
acquiring and comparing the time node information with the time information of the current game operation;
and if the time node information and the time information of the current game operation are within a preset time interval range, determining that the first video file is matched with the current game operation state.
7. The game video generation method of claim 5, wherein the step of determining whether the identification information matches a current game running state comprises:
acquiring and comparing the interface information with a current game running interface, wherein the interface information comprises character elements, graphic elements and color elements;
and if the similarity of the two elements meets a preset condition, determining that the first video file is matched with the current game running state.
8. The method for generating game video according to claim 1, wherein if there is a match, the step of generating the second video file according to the first video file and the current game running status comprises:
acquiring attribute information of a first video file;
generating a third video file according to the attribute information and the current game running state;
synthesizing the first video file and the third video file according to a preset sequence to generate a second video file;
and generating new identification information according to the second video file.
9. A mobile terminal, characterized in that the mobile terminal comprises: a memory, a processor and a game video generation program stored on the memory and executable on the processor, the game video generation program, when executed by the processor, implementing the steps of the game video generation method according to any one of claims 1 to 8.
10. A readable storage medium, characterized in that the readable storage medium has stored thereon a generation program of a game video, which when executed by a processor realizes the steps of the generation method of a game video according to any one of claims 1 to 8.
CN201910966756.4A 2019-10-12 2019-10-12 Game video generation method, terminal and readable storage medium Pending CN110694264A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910966756.4A CN110694264A (en) 2019-10-12 2019-10-12 Game video generation method, terminal and readable storage medium

Publications (1)

Publication Number Publication Date
CN110694264A 2020-01-17

Family

ID=69198532

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910966756.4A Pending CN110694264A (en) 2019-10-12 2019-10-12 Game video generation method, terminal and readable storage medium

Country Status (1)

Country Link
CN (1) CN110694264A (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113297065A (en) * 2020-11-16 2021-08-24 阿里巴巴集团控股有限公司 Data processing method, game-based processing method and device and electronic equipment
CN114173182A (en) * 2022-01-17 2022-03-11 广州博冠信息科技有限公司 Game video recording method and device and game video reading method and device

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination