CN117224942A - Game interaction method and device and electronic equipment - Google Patents


Info

Publication number
CN117224942A
CN117224942A (application CN202311333976.6A)
Authority
CN
China
Prior art keywords
game
video
interface
input
target
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202311333976.6A
Other languages
Chinese (zh)
Inventor
陈曦 (Chen Xi)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Vivo Mobile Communication Co Ltd
Original Assignee
Vivo Mobile Communication Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Vivo Mobile Communication Co Ltd filed Critical Vivo Mobile Communication Co Ltd
Priority to CN202311333976.6A priority Critical patent/CN117224942A/en
Publication of CN117224942A publication Critical patent/CN117224942A/en
Pending legal-status Critical Current

Landscapes

  • User Interface Of Digital Computer (AREA)

Abstract

The application discloses a game interaction method, a game interaction device, and an electronic device, belonging to the technical field of electronic devices. The method includes: receiving a first input on a game interface; in response to the first input, displaying a video interface on the same screen as the game interface, where the video interface includes at least one game video identifier, the game video corresponding to the at least one game video identifier is retrieved by searching according to game information, and the game information corresponds to the current game progress; receiving a second input on a target game video identifier among the at least one game video identifier; and in response to the second input, playing the target game video corresponding to the target game video identifier on the video interface.

Description

Game interaction method and device and electronic equipment
Technical Field
The application belongs to the technical field of electronic devices and relates in particular to a game interaction method, a game interaction device, and an electronic device.
Background
Currently, in online games such as adventure, strategy, puzzle, and role-playing games, many levels or tasks are completed with the help of game walkthrough videos. In the related art, to find a walkthrough video the user must first exit the currently running game interface, open the search interface of another application, search for the walkthrough video, select the desired one from the results, watch it, and only then reopen the game and continue from the game interface, which makes the operation cumbersome.
Disclosure of Invention
Embodiments of the present application aim to provide a game interaction method, a game interaction device, and an electronic device that can avoid the cumbersome operations of viewing a game walkthrough video during play.
In a first aspect, an embodiment of the present application provides a game interaction method, where the method includes:
receiving a first input on a game interface;
in response to the first input, displaying a video interface on the same screen as the game interface, where the video interface includes at least one game video identifier, the game video corresponding to the at least one game video identifier is retrieved by searching according to game information, and the game information corresponds to the current game progress;
receiving a second input on a target game video identifier among the at least one game video identifier;
and in response to the second input, playing the target game video corresponding to the target game video identifier on the video interface.
In a second aspect, an embodiment of the present application provides a game interaction device, including:
a first receiving module, configured to receive a first input on a game interface;
a display module, configured to display, in response to the first input, a video interface on the same screen as the game interface, where the video interface includes at least one game video identifier, the game video corresponding to the at least one game video identifier is retrieved by searching according to game information, and the game information corresponds to the current game progress;
a second receiving module, configured to receive a second input on a target game video identifier among the at least one game video identifier;
and a first playing module, configured to play, in response to the second input, the target game video corresponding to the target game video identifier on the video interface.
In a third aspect, an embodiment of the present application provides an electronic device comprising a processor and a memory storing a program or instructions executable on the processor, which when executed by the processor, implement the steps of the method as described in the first aspect.
In a fourth aspect, embodiments of the present application provide a readable storage medium having stored thereon a program or instructions which when executed by a processor perform the steps of the method according to the first aspect.
In a fifth aspect, an embodiment of the present application provides a chip, where the chip includes a processor and a communication interface, where the communication interface is coupled to the processor, and where the processor is configured to execute a program or instructions to implement a method according to the first aspect.
In a sixth aspect, embodiments of the present application provide a computer program product stored in a storage medium, the program product being executable by at least one processor to implement the method according to the first aspect.
In embodiments of the present application, a first input on a game interface is received; in response, a video interface is displayed on the same screen as the game interface, the video interface including at least one game video identifier whose corresponding game video is retrieved according to game information reflecting the current game progress. A second input on a target game video identifier among the at least one identifier is then received, and in response the corresponding target game video is played on the video interface. The user can thus watch a game walkthrough video on the same screen while operating the game on the game interface; the two tasks run in parallel, no switching back and forth between the game application and a walkthrough-search application is needed, and the cumbersome operations of viewing a walkthrough video during play are avoided.
Drawings
FIG. 1 is a flow chart of a game interaction method provided by an embodiment of the application;
FIG. 2 is one of exemplary diagrams of a game interaction method provided by an embodiment of the present application;
FIG. 3 is a second exemplary diagram of a game interaction method provided by an embodiment of the present application;
FIG. 4 is a block diagram of a game interaction device according to an embodiment of the present application;
FIG. 5 is a schematic structural diagram of an electronic device according to an embodiment of the present application;
FIG. 6 is a schematic diagram of the hardware structure of an electronic device implementing various embodiments of the application.
Detailed Description
The technical solutions of the embodiments of the present application are described below with reference to the drawings. The described embodiments are some, but not all, of the embodiments of the present application. All other embodiments obtained by a person skilled in the art on the basis of the embodiments of the present application fall within the scope of protection of the present application.
The terms "first", "second", and the like in the description and claims distinguish between similar objects and do not describe a particular sequence or chronological order. It should be understood that terms so used are interchangeable where appropriate, so that the embodiments of the application can be practiced in sequences other than those illustrated or described herein, and that the objects distinguished by "first", "second", and so on are generally of one type whose number is not limited; for example, there may be one or more first objects. Furthermore, in the description and claims, "and/or" denotes at least one of the connected objects, and the character "/" generally indicates an "or" relationship between the associated objects.
The embodiment of the application provides a game interaction method and device and electronic equipment. The game interaction method provided by the embodiment of the application is described in detail below with reference to the accompanying drawings.
It should be noted that the game interaction method provided by the embodiments of the present application can be applied to an electronic device. In practice, the electronic device may be a mobile terminal such as a smartphone or a tablet computer; the embodiments of the present application are not limited in this respect.
Fig. 1 is a flowchart of a game interaction method according to an embodiment of the present application. As shown in fig. 1, the method may include steps 101, 102, 103, and 104.
in step 101, a first input to a game interface is received.
In the embodiments of the present application, the game interface is the interface of the game currently running on the screen of the electronic device.
The first input triggers a search for a game walkthrough video matching the currently running game interface and the display of a video interface containing identifiers of the found videos.
In the embodiments of the present application, an identifier is text, a symbol, an image, or the like used to indicate information; a control or another container may serve as the carrier for displaying the information. Identifiers include, but are not limited to, text identifiers, symbol identifiers, and image identifiers.
The first input may be a gesture input, for example drawing a circle on the game interface or sliding inward from the screen edge. Alternatively, the first input may be a click input on a specific control on the game interface, where that control triggers a search for a game walkthrough video matching the currently running game interface. For ease of description, "game video" is used below in place of "game walkthrough video".
In step 102, in response to the first input, a video interface is displayed on the same screen as the game interface. The video interface includes at least one game video identifier, the game video corresponding to the at least one game video identifier is retrieved by searching according to game information, and the game information corresponds to the current game progress.
After the first input is received, it is determined that the user intends to search for a game video matching the current progress of the game interface. The background is then triggered to acquire the game information of the game interface, a matching game video is searched for on a network platform or in a pre-built video library according to that information, and the corresponding game video identifiers are displayed on the video interface for the user to select, all without exiting the game application.
Two independent windows can be displayed side by side on the screen of the electronic device, one showing the game interface and the other the video interface. Alternatively, the two windows can overlap: one window is displayed full screen and shows the game interface, while the other is displayed as a small window and shows the video interface.
Because the video interface is displayed on the same screen as the game interface and shows the game video identifiers matching the current progress, the user can select the desired video for the current progress while continuing to play, so playing the game and looking up a walkthrough proceed without interrupting each other.
The game video can be searched for on a network platform; since such a search draws on the whole network, the results are relatively comprehensive. For easier browsing, videos from the same source channel can be displayed in aggregate: the video interface can include several first-level tabs, each corresponding to one network source channel, and clicking a tab shows the game video identifiers from that channel.
Alternatively, each game developer can divide a game into several level tasks in advance, collect the game information corresponding to each task, build an identifier for each task (the level-task identifier), and produce a game video for each task, thereby constructing a per-game video library. Searching this pre-built library makes the search respond faster and, being targeted, yields more accurate results. The library can also accept game videos edited and uploaded by users, continuously enriching its content.
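The patent does not give an implementation of this pre-built library. As a rough illustration only, the Python sketch below (all names hypothetical) indexes walkthrough videos by level-task identifier so a search for the current task is a direct lookup rather than a network-wide scan:

```python
# Hypothetical sketch of the per-game video library described above:
# walkthrough videos, whether developer-produced or user-uploaded, are
# indexed by level-task identifier, so the targeted search is a lookup.

class VideoLibrary:
    def __init__(self):
        self._by_task = {}  # level-task id -> list of video entries

    def add_video(self, task_id, video):
        """Register a walkthrough video under one level task."""
        self._by_task.setdefault(task_id, []).append(video)

    def search(self, task_id):
        """Return all walkthrough videos for the given level task."""
        return list(self._by_task.get(task_id, []))


library = VideoLibrary()
library.add_video("chapter1-boss", {"title": "XX video"})
library.add_video("chapter1-boss", {"title": "YY video"})
print([v["title"] for v in library.search("chapter1-boss")])
```

A real library would of course store media references and metadata rather than plain dictionaries, but the lookup-by-task structure is what gives the faster, targeted search the passage describes.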
In step 103, a second input on a target game video identifier among the at least one game video identifier is received.
The second input selects the target game video identifier from the at least one game video identifier on the video interface and triggers playback of the corresponding target game video. In practice, the second input may be a click input on the target game video identifier.
In step 104, in response to the second input, the target game video corresponding to the target game video identifier is played on the video interface.
Because the video interface is displayed on the same screen as the game interface and the game video is played on the video interface, the user can watch the walkthrough video for the current progress while operating the game, without switching back and forth between the game application and a walkthrough-search application; the cumbersome operations of viewing a walkthrough video during play are thus avoided.
In this embodiment, a first input on the game interface is received; in response, a video interface is displayed on the same screen as the game interface, including at least one game video identifier whose corresponding video is retrieved according to game information reflecting the current game progress; a second input on a target game video identifier among the at least one identifier is received; and in response, the target game video is played on the video interface. The user can thus view a walkthrough video on the same screen while playing, the two tasks run in parallel without switching between the game application and a walkthrough-search application, and the cumbersome operations of viewing a walkthrough video during play are avoided.
In some embodiments provided by the present application, the game information of the game interface may include at least one of: a game picture, game audio, a game subtitle, and game progress information.
Game subtitles usually contain guiding text, such as a quest prompt like "learn about the nearby situation from the Qianyan Army", and this text reflects the current progress of the game interface, so the subtitle can serve as one element of the game information. However, using the subtitle directly may carry redundant information that hampers the subsequent video search, so keywords can instead be extracted from the subtitle and used as one element of the game information.
Game audio generally contains music or dialogue that likewise reflects the current progress of the game interface, so the game audio can serve as one element of the game information.
The game picture generally contains scene or character information that represents the current progress of the game interface, so the game picture can serve as one element of the game information.
Game progress information such as the current level or game node is directly related to the game progress, so it too can serve as one element of the game information.
In some embodiments, because subtitles are relatively cheap to acquire and recognize, an interface that has subtitles can be handled by searching according to the subtitle alone, which improves search speed and reduces the system resources the search consumes: keywords are extracted from the subtitle and used as search terms against the network platform or the video library.
In some embodiments, because game audio also contains keyword information, the subtitle and the audio can be combined to make the results more comprehensive: keywords are extracted from the subtitle, the game audio is transcribed to text and keywords are extracted from that text, and the keywords of both the subtitle dimension and the audio dimension are used as search terms against the network platform or the video library.
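The keyword-merging step above can be sketched as follows. This is an illustrative stand-in, not the patent's method: the stop-word list and tokenizer are made up, and a production system would use a proper extraction model.

```python
# Hypothetical keyword search sketch: keywords are pulled from the game
# subtitle and from audio transcribed to text, merged with duplicates
# removed, and used as search terms. Stop words are illustrative only.

STOP_WORDS = {"the", "to", "a", "of", "and", "about", "at"}

def extract_keywords(text):
    """Keep the informative words of a subtitle or transcript."""
    words = text.lower().replace(",", " ").split()
    return [w for w in words if w not in STOP_WORDS]

def build_search_terms(subtitle, audio_transcript=""):
    """Merge subtitle-dimension and audio-dimension keywords."""
    terms = extract_keywords(subtitle) + extract_keywords(audio_transcript)
    return list(dict.fromkeys(terms))  # de-duplicate, preserve order

print(build_search_terms("Talk to the guard about the siege",
                         "the siege begins"))
```

The merged term list would then be submitted as the query against the network platform or the video library.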
In some embodiments, some game applications do not show subtitles on the game interface, and a changed typesetting layout may make subtitle recognition inaccurate. For such interfaces, the game video can be searched for according to the game picture and/or the game audio: keywords of the picture are recognized through image recognition, keywords of the audio through speech recognition, and the picture-dimension and/or audio-dimension keywords are used as search terms against the network platform or the video library. Alternatively, image features of the game picture (such as the number of objects, their outlines, and their layout) and audio features of the game audio (such as the name of the music being played or a character's voice timbre) can be recognized, the level-task name corresponding to the game interface determined from those features, and the game video searched for by that task name. This supports walkthrough searches even when the task name is not displayed on the game interface, so the approach applies to a wider range of scenarios.
In some embodiments, because a game generally consists of a sequence of level tasks (after passing one task the user enters the next, and so on) and the game progress information characterizes the level task in progress, the level-task identifier corresponding to the game interface can be determined from the progress information and the game video searched for on the network platform or in the video library by the level-task name.
Thus, in the embodiments of the present application, the game video can be searched for along one or several of the dimensions of game picture, game audio, game subtitle, and game progress information. This both meets users' varied search needs and applies to many kinds of game scenarios.
In some embodiments provided by the present application, step 102 may include step 1021.
In step 1021, in response to the first input, a floating window is displayed above the game interface, and the video interface is displayed in the floating window.
To reduce interference with the user's play, a floating window can be displayed above the game interface and the video interface shown inside it, achieving the same-screen display of the game interface and the video interface.
The display position of the floating window can be customized by the user, for example placed at the left edge of the screen. Alternatively, following the habits of most users, it can be displayed at the right edge of the screen so the user can watch the game video while operating the game.
After the floating window appears above the game interface, the user can move it, and can also enlarge or shrink it to enlarge or shrink the video interface.
In one example, as shown in fig. 2, when the user wants to watch a game video while playing, a finger can slide inward from the screen edge on the game interface to trigger display of a floating window above it. The floating window shows a video interface with several game video identifiers, for example "1. XX video", "2. YY video", and "3. ZZ video". Clicking any identifier, say "1. XX video", triggers playback of the XX video on the video interface of the floating window, so the user can operate the game on the game interface while watching the video in the floating window.
In some embodiments provided by the present application, the game information of the game interface includes the game picture, game audio, game subtitle, and game progress information. On the basis of the embodiment shown in fig. 1, the method can further include steps 105 and 106.
In step 105, game features of the game interface are determined from the game picture, game audio, game subtitle, and game progress information corresponding to the current game progress.
Game pictures and plots can be too complex to describe simply in language, and thus hard to find by text retrieval. The game features of the game interface can therefore be determined from the several dimensions of picture, audio, subtitle, and progress information, and the video search performed on those features.
In embodiments of the present application, the game features may include scene features, character features, music features, level-task features, and the like.
In step 106, the similarity between the game features of the game interface and the game features of each video on the video platform is computed, and the videos whose similarity exceeds a similarity threshold are determined to be the game videos corresponding to the at least one game video identifier.
The game features of each video on the video platform are determined in the same way as the game features of the game interface.
The higher the similarity of two objects, the better they match, which is why the platform videos whose similarity exceeds the threshold are taken as the game videos corresponding to the at least one game video identifier.
Thus, in the embodiments of the present application, the game features of the game interface can be determined from the multiple dimensions of picture, audio, subtitle, and progress information, and the video search carried out on those features, making the results more comprehensive and accurate.
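Steps 105 and 106 can be sketched as a feature-vector comparison. The representation below is a made-up placeholder (the patent does not specify one); cosine similarity is one plausible measure, used here purely for illustration.

```python
# Hypothetical similarity search for steps 105-106: scene, character,
# music, and task features are encoded as numeric vectors, compared by
# cosine similarity, and videos above a threshold become candidates.
import math

def cosine_similarity(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm if norm else 0.0

def match_videos(interface_features, platform_videos, threshold=0.8):
    """Return videos whose features are similar enough to the interface's."""
    return [video for video, features in platform_videos
            if cosine_similarity(interface_features, features) > threshold]

current = [0.9, 0.1, 0.8, 0.3]                  # features of the game interface
videos = [("XX video", [0.8, 0.2, 0.7, 0.3]),   # close match
          ("YY video", [0.1, 0.9, 0.0, 0.9])]   # unrelated
print(match_videos(current, videos))
```

The identifiers of the matched videos would then populate the video interface for the user to choose from.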
In some embodiments provided by the present application, game walkthrough notes may be generated automatically from the target game video played on the video interface by an intelligent clipping algorithm.
In some embodiments, the user may also be supported in generating walkthrough notes manually from the target game video played on the video interface. Accordingly, on the basis of the embodiment shown in fig. 1, the method may further include steps 107, 108, and 109 after step 104.
In step 107, a third input to the video interface is received.
The third input triggers the acquisition of a target video frame image from the target game video. In practice, it may be a click input on a screen-capture control of the video interface, a click input on the video picture, or the like.
In step 108, in response to the third input, a target video frame image is obtained from the target game video played on the video interface.
For example, during playback of the target game video, if the user clicks the screen-capture control, the video picture playing at the moment of the click is taken as the target video frame image. Alternatively, if the user clicks on the video picture being played, that picture is taken as the target video frame image.
In step 109, walkthrough notes are generated from the target video frame image.
The walkthrough notes may take the form of a video, a presentation, or another format.
In some embodiments, the user may edit the target video frame image and generate the walkthrough notes from the edited image. Accordingly, step 109 may include steps 1091, 1092, and 1093.
In step 1091, a fourth input on the target video frame image is received.
The fourth input edits the target video frame image. The editing may include at least one of: zooming the image, cropping a key area, adjusting the image's hue, editing an object in the image, and the like.
In step 1092, the target video frame image is edited in response to the fourth input.
In step 1093, walkthrough notes are generated from the edited target video frame image.
Supporting the user in editing the target video frame image, and generating the notes from the edited image, lets a second pass of editing highlight the content of the video picture, so the generated walkthrough notes are more refined and their central content stands out.
In some embodiments, the user may add text description information to the target video frame image and generate the walkthrough notes from the image together with the text. Accordingly, step 109 may include steps 1094, 1095, and 1096.
In step 1094, a fifth input is received.
The fifth input enters the text description information. It may be text input triggered in a text box, where each target video frame image may have its own text box, or all target video frame images may share one.
In step 1095, text description information is added to the target video frame image in response to the fifth input.
In step 1096, game play notes are generated from the target video frame images and the text description information.
According to the embodiment of the application, a user can be supported to add text description information to the target video frame image, game attack notes are generated according to the target video frame image and the text description information, and the content of a video picture can be expressed more accurately by adding the text description information, so that the generated game attack notes are richer and more understandable.
In one example, as shown in fig. 3, a floating window is displayed at a right position of an upper layer of the game interface, a video interface is displayed in the floating window, and during playing of a target game video on the video interface, a user can click on a screen capturing control at the upper right side of the video interface, that is, a video picture currently being played by the target game video can be added into an image frame below the video interface, for example, the image frame displays a video frame image 1, a video frame image 2 and a video frame image 3 which are already captured respectively. The user can click the video frame image which is already intercepted in the image frame for editing, and can also input text description information in a text box below the image frame, so that the text description information is added for the video frame image in the image frame, and after the video frame image editing and the text description information adding are completed, a completion control is clicked to generate a game attack note.
Therefore, in the embodiment of the present application, secondary editing of the game video by the user can be supported to generate a game attack note. Because the game attack note is custom-edited by the user, the generated game attack note can meet the viewing needs of most game users.
In some embodiments of the present application, the provided game interaction method may further include, after step 109, the following step: step 110;
in step 110, a game play note is played, wherein the play progress of the game play note corresponds to the game progress.
In the embodiment of the present application, during playing of the game attack note, the user can be supported in manually adjusting the playing progress of the game attack note. For example, a sliding control may be provided at the side of the playing interface, and the user can manually adjust the playing progress of the game attack note through the sliding control.
In the embodiment of the present application, after the game attack note is generated, during subsequent game play, the game attack note can be matched through game pictures, game subtitles, and the like, and played along with the user's game progress. This saves the user the step of searching for a game attack, thereby improving the convenience of game interaction.
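One plausible way to keep the note's playing progress aligned with the game progress is to match the current game subtitle against the text of each note entry. The patent does not specify a matching technique; a simple word-overlap (Jaccard) score is used here purely as an illustrative stand-in.

```python
def word_overlap(a: str, b: str) -> float:
    # Jaccard similarity over lowercase word sets
    wa, wb = set(a.lower().split()), set(b.lower().split())
    return len(wa & wb) / max(1, len(wa | wb))

def match_note_entry(current_subtitle: str, entries: list) -> dict:
    # pick the note entry whose caption best matches the current game subtitle
    return max(entries, key=lambda e: word_overlap(current_subtitle, e["caption"]))

entries = [
    {"caption": "dodge the boss jump attack"},
    {"caption": "collect the hidden key"},
]
best = match_note_entry("boss jump incoming", entries)
```

In practice the matching could equally use game pictures (image features) instead of subtitles; the structure of the lookup would be the same.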
In some embodiments of the present application, the provided game interaction method may further include, after step 109, the following step: step 120;
in step 120, game play notes are shared.
In the embodiment of the present application, after the game attack note is generated, the user can share the game attack note with game friends. Alternatively, the user may share the game attack note on the gaming platform, and the note shared by the user may be recommended when other game players reach the same game progress. In addition, players may like or score each note, and the recommended ranking may be determined according to the number of likes or the size of the score.
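The recommendation ordering described above might be implemented, under the assumption that each shared note carries a like count and a score (field names here are illustrative, not from the patent), roughly as:

```python
def rank_notes(notes: list) -> list:
    # order shared notes by like count, then by score, highest first
    return sorted(notes, key=lambda n: (n["likes"], n["score"]), reverse=True)

shared = [
    {"id": "note_a", "likes": 3, "score": 4.0},
    {"id": "note_b", "likes": 5, "score": 3.5},
    {"id": "note_c", "likes": 5, "score": 4.5},
]
ranked = rank_notes(shared)
```

Sorting on a tuple makes the like count the primary key and the score the tie-breaker, matching the "number of likes or size of the score" criterion.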
According to the game interaction method provided in the embodiments of the present application, the execution subject may be a game interaction device. In the embodiments of the present application, the game interaction device is described by taking, as an example, the game interaction method being executed by the game interaction device.
Fig. 4 is a block diagram of a game interaction device according to an embodiment of the present application, and as shown in fig. 4, a game interaction device 400 may include: a first receiving module 401, a display module 402, a second receiving module 403, and a first playing module 404;
A first receiving module 401 for receiving a first input to the game interface;
a display module 402, configured to respond to the first input, and display a video interface on the same screen as the game interface, where the video interface includes at least one game video identifier, where a game video corresponding to the at least one game video identifier is obtained by searching according to game information, and the game information corresponds to a current game progress;
a second receiving module 403 for receiving a second input of a target game video identification of the at least one game video identification;
and the first playing module 404 is configured to respond to the second input, and play the target game video corresponding to the target game video identifier on the video interface.
In the embodiment, a first input to the game interface is received. In response to the first input, the video interface is displayed on the same screen as the game interface, where the video interface includes at least one game video identifier, the game video corresponding to the at least one game video identifier is obtained by searching according to game information, and the game information corresponds to the current game progress. A second input to a target game video identifier in the at least one game video identifier is received, and in response to the second input, the target game video corresponding to the target game video identifier is played on the video interface. In this way, the user can view the game attack video on the same screen while operating the game on the game interface; the two tasks proceed in parallel without switching back and forth between the game application and an attack search application, avoiding the complex operation of viewing game attack videos during a game.
Alternatively, as an embodiment, the display module 402 may include:
and the display sub-module is used for responding to the first input, displaying a floating window on the upper layer of the game interface and displaying a video interface through the floating window.
Alternatively, as an embodiment, the game information may include at least one of: game pictures, game audio, game subtitles, game progress information.
Optionally, as an embodiment, the game information includes: game pictures, game audio, game subtitles, and game progress information;
the game interaction device 400 may further include:
the first determining module is used for determining game characteristics of the game interface according to game pictures, game audios, game subtitles and game progress information corresponding to the current game progress;
and the second determining module is used for performing similarity calculation between the game characteristics of the game interface and the game characteristics of each video on the video platform, and determining videos whose similarity is greater than a similarity threshold as the game videos corresponding to the at least one game video identifier.
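The similarity filtering performed by the second determining module can be sketched as follows. The patent does not specify a similarity measure; cosine similarity over feature vectors is used here purely as an illustrative assumption, and the 0.8 threshold is likewise invented for the sketch.

```python
import math

def cosine_similarity(a, b):
    # cosine of the angle between two feature vectors
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return dot / (na * nb) if na and nb else 0.0

def select_game_videos(interface_feature, platform_videos, threshold=0.8):
    # keep only videos whose feature similarity to the game interface
    # exceeds the similarity threshold
    return [v for v in platform_videos
            if cosine_similarity(interface_feature, v["feature"]) > threshold]

videos = [
    {"id": "v1", "feature": [1.0, 0.0]},
    {"id": "v2", "feature": [0.0, 1.0]},
]
selected = select_game_videos([1.0, 0.0], videos)
```

In a real system the interface feature would be derived from the game picture, audio, subtitles, and progress information described for the first determining module.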
Optionally, as an embodiment, the game interaction device 400 may further include:
A third receiving module for receiving a third input to the video interface;
the acquisition module is used for responding to the third input and acquiring a target video frame image from the target game video played by the video interface;
and the generation module is used for generating game attack notes according to the target video frame image.
Alternatively, as an embodiment, the generating module may include:
a first receiving sub-module for receiving a fourth input of the target video frame image;
an editing sub-module, configured to edit the target video frame image in response to the fourth input;
and the first generation sub-module is used for generating game attack notes according to the edited target video frame image.
Alternatively, as an embodiment, the generating module may include:
a second receiving sub-module for receiving a fifth input;
an adding sub-module, configured to add text description information to the target video frame image in response to the fifth input;
and the second generation sub-module is used for generating game attack notes according to the target video frame image and the text description information.
Optionally, as an embodiment, the game interaction device 400 may further include:
And the second playing module is used for playing the game attack notes, wherein the playing progress of the game attack notes corresponds to the game progress.
The game interaction device in the embodiments of the present application may be an electronic device, or a component in an electronic device, such as an integrated circuit or a chip. The electronic device may be a terminal, or may be a device other than a terminal. By way of example, the electronic device may be a mobile phone, a tablet computer, a notebook computer, a palmtop computer, a vehicle-mounted electronic device, a mobile Internet device (Mobile Internet Device, MID), an augmented reality (Augmented Reality, AR)/virtual reality (Virtual Reality, VR) device, a robot, a wearable device, an ultra-mobile personal computer (Ultra-Mobile Personal Computer, UMPC), a netbook, or a personal digital assistant (Personal Digital Assistant, PDA), etc.; it may also be a server, a network attached storage (Network Attached Storage, NAS), a personal computer (Personal Computer, PC), a television (Television, TV), a teller machine, or a self-service machine, etc. The embodiments of the present application are not specifically limited thereto.
The game interaction device in the embodiments of the present application may be a device having an operating system. The operating system may be an Android operating system, an iOS operating system, or another possible operating system; the embodiments of the present application are not specifically limited thereto.
The game interaction device provided by the embodiment of the present application can implement each process implemented by the embodiment of the method shown in fig. 1, and in order to avoid repetition, details are not repeated here.
Optionally, as shown in fig. 5, an embodiment of the present application further provides an electronic device 500, including a processor 501 and a memory 502, where the memory 502 stores a program or instructions executable on the processor 501. When executed by the processor 501, the program or instructions implement the steps of the above game interaction method embodiments and achieve the same technical effects; to avoid repetition, details are not repeated here.
The electronic device in the embodiment of the application includes the mobile electronic device and the non-mobile electronic device.
Fig. 6 is a schematic diagram of a hardware structure of an electronic device implementing various embodiments of the application.
The electronic device 600 includes, but is not limited to: radio frequency unit 601, network module 602, audio output unit 603, input unit 604, sensor 605, display unit 606, user input unit 607, interface unit 608, memory 609, and processor 610.
Those skilled in the art will appreciate that the electronic device 600 may further include a power source (e.g., a battery) for powering the various components; the power source may be logically connected to the processor 610 through a power management system, so that functions such as charge management, discharge management, and power consumption management are performed through the power management system. The electronic device structure shown in fig. 6 does not constitute a limitation of the electronic device; the electronic device may include more or fewer components than shown, or certain components may be combined, or a different arrangement of components may be used. Details are not described herein again.
A user input unit 607 for receiving a first input to a game interface;
a display unit 606, configured to respond to the first input, and display a video interface on the same screen as the game interface, where the video interface includes at least one game video identifier, where a game video corresponding to the at least one game video identifier is obtained by searching according to game information, and the game information corresponds to a current game progress;
a user input unit 607 further for receiving a second input of a target game video identification of said at least one game video identification;
and a processor 610, configured to play, in response to the second input, a target game video corresponding to the target game video identifier on the video interface.
Therefore, in the embodiment of the application, a user can view the game attack video on the same screen while operating the game on the game interface, and the two tasks are parallel, so that the game application and the attack search application do not need to be switched back and forth, and the tedious operation of viewing the game attack video in the game process is avoided.
Optionally, as an embodiment, the display unit 606 is further configured to display a floating window on an upper layer of the game interface in response to the first input, and display a video interface through the floating window.
Optionally, as an embodiment, the game information includes at least one of: game pictures, game audio, game subtitles, game progress information.
Optionally, as an embodiment, the game information includes: game pictures, game audio, game subtitles, and game progress information;
the processor 610 is further configured to determine a game feature of the game interface according to a game picture, a game audio, a game subtitle, and game progress information corresponding to a current game progress; and according to the game characteristics of the game interface, performing similarity calculation with the game characteristics of each video of the video platform, and determining the video with the similarity larger than a similarity threshold as the game video corresponding to the at least one game video identifier.
Optionally, as an embodiment, the user input unit 607 is further configured to receive a third input to the video interface;
a processor 610 further configured to obtain a target video frame image from the target game video played by the video interface in response to the third input; and generating game attack notes according to the target video frame image.
Optionally, as an embodiment, the user input unit 607 is further configured to receive a fourth input to the target video frame image;
A processor 610 further configured to edit the target video frame image in response to the fourth input; and generating game attack notes according to the edited target video frame image.
Optionally, as an embodiment, the user input unit 607 is further configured to receive a fifth input;
a processor 610 further configured to add text description information to the target video frame image in response to the fifth input; and generating game attack notes according to the target video frame image and the text description information.
Optionally, as an embodiment, the processor 610 is further configured to play the game attack note, where a play progress of the game attack note corresponds to a game progress.
It should be understood that in an embodiment of the present application, the input unit 604 may include a graphics processor (Graphics Processing Unit, GPU) 6041 and a microphone 6042, and the graphics processor 6041 processes image data of still pictures or video obtained by an image capturing apparatus (e.g., a camera) in a video capturing mode or an image capturing mode. The display unit 606 may include a display panel 6061, and the display panel 6061 may be configured in the form of a liquid crystal display, an organic light emitting diode, or the like. The user input unit 607 includes at least one of a touch panel 6071 and other input devices 6072. The touch panel 6071 is also called a touch screen. The touch panel 6071 may include two parts of a touch detection device and a touch controller. Other input devices 6072 may include, but are not limited to, a physical keyboard, function keys (e.g., volume control keys, switch keys, etc.), a trackball, a mouse, a joystick, and so forth, which are not described in detail herein.
The memory 609 may be used to store software programs as well as various data. The memory 609 may mainly include a first storage area storing programs or instructions and a second storage area storing data, where the first storage area may store an operating system, and application programs or instructions (such as a sound playing function and an image playing function) required for at least one function, and the like. Further, the memory 609 may include volatile memory or non-volatile memory, or the memory 609 may include both volatile and non-volatile memory. The non-volatile memory may be a read-only memory (Read-Only Memory, ROM), a programmable ROM (Programmable ROM, PROM), an erasable PROM (Erasable PROM, EPROM), an electrically erasable PROM (Electrically EPROM, EEPROM), or a flash memory. The volatile memory may be a random access memory (Random Access Memory, RAM), a static RAM (Static RAM, SRAM), a dynamic RAM (Dynamic RAM, DRAM), a synchronous DRAM (Synchronous DRAM, SDRAM), a double data rate SDRAM (Double Data Rate SDRAM, DDR SDRAM), an enhanced SDRAM (Enhanced SDRAM, ESDRAM), a synchlink DRAM (Synchlink DRAM, SLDRAM), or a direct Rambus RAM (Direct Rambus RAM, DRRAM). The memory 609 in the embodiments of the present application includes, but is not limited to, these and any other suitable types of memory.
The processor 610 may include one or more processing units; optionally, the processor 610 integrates an application processor that primarily processes operations involving an operating system, user interface, application programs, and the like, and a modem processor that primarily processes wireless communication signals, such as a baseband processor. It will be appreciated that the modem processor described above may not be integrated into the processor 610.
The embodiment of the application also provides a readable storage medium, on which a program or an instruction is stored, which when executed by a processor, implements each process of the above game interaction method embodiment, and can achieve the same technical effect, so that repetition is avoided, and no further description is given here.
Wherein the processor is the processor in the electronic device described in the above embodiment. The readable storage medium includes a computer-readable storage medium, such as a read-only memory (ROM), a random access memory (RAM), a magnetic disk, or an optical disk.
The embodiment of the application further provides a chip, which comprises a processor and a communication interface, wherein the communication interface is coupled with the processor, and the processor is used for running programs or instructions to realize the processes of the game interaction method embodiment, and the same technical effects can be achieved, so that repetition is avoided, and the description is omitted here.
It should be understood that the chip mentioned in the embodiments of the present application may also be referred to as a system-level chip, a system chip, a chip system, or a system-on-chip, etc.
Embodiments of the present application provide a computer program product stored in a storage medium, where the program product is executed by at least one processor to implement the respective processes of the above-described game interaction method embodiments, and achieve the same technical effects, and for avoiding repetition, a detailed description is omitted herein.
It should be noted that, in this document, the terms "comprises," "comprising," or any other variation thereof are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements includes not only those elements but may also include other elements not expressly listed, or elements inherent to such a process, method, article, or apparatus. Without further limitation, an element preceded by "comprising a ..." does not preclude the presence of other identical elements in the process, method, article, or apparatus that comprises the element. Furthermore, it should be noted that the scope of the methods and apparatus in the embodiments of the present application is not limited to performing the functions in the order shown or discussed; the functions may also be performed in a substantially simultaneous manner or in a reverse order depending on the functions involved. For example, the described methods may be performed in an order different from that described, and various steps may be added, omitted, or combined. Additionally, features described with reference to certain examples may be combined in other examples.
From the above description of the embodiments, it will be clear to those skilled in the art that the above-described embodiment method may be implemented by means of software plus a necessary general hardware platform, but of course may also be implemented by means of hardware, but in many cases the former is a preferred embodiment. Based on such understanding, the technical solution of the present application may be embodied essentially or in a part contributing to the prior art in the form of a computer software product stored in a storage medium (e.g. ROM/RAM, magnetic disk, optical disk) comprising instructions for causing a terminal (which may be a mobile phone, a computer, a server, or a network device, etc.) to perform the method according to the embodiments of the present application.
The embodiments of the present application have been described above with reference to the accompanying drawings, but the present application is not limited to the above-described embodiments, which are merely illustrative and not restrictive, and many forms may be made by those having ordinary skill in the art without departing from the spirit of the present application and the scope of the claims, which are to be protected by the present application.

Claims (10)

1. A method of game interaction, the method comprising:
receiving a first input to a game interface;
responding to the first input, displaying a video interface on the same screen as the game interface, wherein the video interface comprises at least one game video identifier, and the game video corresponding to the at least one game video identifier is obtained by searching according to game information, and the game information corresponds to the current game progress;
receiving a second input of a target game video identification of the at least one game video identification;
and responding to the second input, and playing the target game video corresponding to the target game video identifier on the video interface.
2. The method of claim 1, wherein the displaying a video interface on-screen with the game interface in response to the first input comprises:
and responding to the first input, displaying a floating window on the upper layer of the game interface, and displaying a video interface through the floating window.
3. The method of claim 1, wherein the game information comprises at least one of: game pictures, game audio, game subtitles, game progress information.
4. The method of claim 1, wherein the game information comprises: game pictures, game audio, game subtitles, and game progress information;
the method further comprises the steps of:
determining game characteristics of the game interface according to game pictures, game audios, game subtitles and game progress information corresponding to the current game progress;
and according to the game characteristics of the game interface, performing similarity calculation with the game characteristics of each video of the video platform, and determining the video with the similarity larger than a similarity threshold as the game video corresponding to the at least one game video identifier.
5. The method of claim 1, wherein after the video interface plays the target game video corresponding to the target game video identifier, the method further comprises:
receiving a third input to the video interface;
responsive to the third input, obtaining a target video frame image from the target game video played by the video interface;
and generating game attack notes according to the target video frame image.
6. The method of claim 5, wherein generating game play notes from the target video frame images comprises:
Receiving a fourth input of the target video frame image;
editing the target video frame image in response to the fourth input;
and generating game attack notes according to the edited target video frame image.
7. The method of claim 5, wherein generating game play notes from the target video frame images comprises:
receiving a fifth input;
in response to the fifth input, adding text description information for the target video frame image;
and generating game attack notes according to the target video frame image and the text description information.
8. The method of claim 5, wherein after generating the game play note from the target video frame image, the method further comprises:
and playing the game attack note, wherein the playing progress of the game attack note corresponds to the game progress.
9. A game interaction device, the device comprising:
a first receiving module for receiving a first input to the game interface;
the display module is used for responding to the first input, displaying a video interface with the game interface on the same screen, wherein the video interface comprises at least one game video identifier, the game video corresponding to the at least one game video identifier is obtained by searching according to game information, and the game information corresponds to the current game progress;
A second receiving module for receiving a second input of a target game video identification of the at least one game video identification;
and the first playing module is used for responding to the second input and playing the target game video corresponding to the target game video identifier on the video interface.
10. An electronic device comprising a processor and a memory storing a program or instructions executable on the processor, which when executed by the processor, implement the steps of the game interaction method of any of claims 1 to 8.
CN202311333976.6A 2023-10-13 2023-10-13 Game interaction method and device and electronic equipment Pending CN117224942A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202311333976.6A CN117224942A (en) 2023-10-13 2023-10-13 Game interaction method and device and electronic equipment


Publications (1)

Publication Number Publication Date
CN117224942A true CN117224942A (en) 2023-12-15

Family

ID=89087939


Country Status (1)

Country Link
CN (1) CN117224942A (en)


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination