CN111569436A - Processing method, device and equipment based on interaction in live broadcast fighting - Google Patents

Processing method, device and equipment based on interaction in live broadcast fighting

Info

Publication number
CN111569436A
CN111569436A (application CN202010393726.1A)
Authority
CN
China
Prior art keywords
drawing information
target drawing
game scene
spectator
touch operation
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202010393726.1A
Other languages
Chinese (zh)
Inventor
张杨华
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Netease Hangzhou Network Co Ltd
Original Assignee
Netease Hangzhou Network Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Netease Hangzhou Network Co Ltd filed Critical Netease Hangzhou Network Co Ltd
Priority to CN202010393726.1A priority Critical patent/CN111569436A/en
Publication of CN111569436A publication Critical patent/CN111569436A/en
Pending legal-status Critical Current

Classifications

    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63FCARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/85Providing additional services to players
    • A63F13/86Watching games played by other players
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63FCARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/85Providing additional services to players
    • A63F13/87Communicating with other players during game play, e.g. by e-mail or chat
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F3/04883Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures for inputting data by handwriting, e.g. gesture or text
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/20Servers specifically adapted for the distribution of content, e.g. VOD servers; Operations thereof
    • H04N21/21Server components or server architectures
    • H04N21/218Source of audio or video content, e.g. local disk arrays
    • H04N21/2187Live feed
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/41Structure of client; Structure of client peripherals
    • H04N21/4104Peripherals receiving signals from specially adapted client devices
    • H04N21/4126The peripheral being portable, e.g. PDAs or mobile phones
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/41Structure of client; Structure of client peripherals
    • H04N21/422Input-only peripherals, i.e. input devices connected to specially adapted client devices, e.g. global positioning system [GPS]
    • H04N21/42204User interfaces specially adapted for controlling a client device through a remote control device; Remote control devices therefor
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/4302Content synchronisation processes, e.g. decoder synchronisation
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/431Generation of visual interfaces for content selection or interaction; Content or additional data rendering
    • H04N21/4318Generation of visual interfaces for content selection or interaction; Content or additional data rendering by altering the content in the rendering process, e.g. blanking, blurring or masking an image region
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/47End-user applications
    • H04N21/478Supplemental services, e.g. displaying phone caller identification, shopping application
    • H04N21/4788Supplemental services, e.g. displaying phone caller identification, shopping application communicating with other users, e.g. chatting
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63FCARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F2300/00Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F2300/50Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterized by details of game servers
    • A63F2300/57Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterized by details of game servers details of game services offered to the player
    • A63F2300/572Communication between players during game play of non game information, e.g. e-mail, chat, file transfer, streaming of audio and streaming of video
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63FCARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F2300/00Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F2300/50Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterized by details of game servers
    • A63F2300/57Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterized by details of game servers details of game services offered to the player
    • A63F2300/577Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterized by details of game servers details of game services offered to the player for watching a game played by other players

Abstract

Embodiments of the invention provide a processing method, apparatus, and device for interaction during live game spectating. The method provides a graphical user interface through a spectator client, and the graphical user interface displays a game scene. Specifically, a first spectator client responds to a touch operation applied to the game scene, obtains first target drawing information corresponding to the touch operation, and renders and displays it in the game scene according to that information; the first spectator client then sends the first target drawing information to the server, so that the server synchronizes it to at least one second spectator client for rendering and display. This improves the accuracy of semantic communication between spectators and thereby improves their participation experience.

Description

Processing method, device and equipment based on interaction in live broadcast fighting
Technical Field
Embodiments of the invention relate to the technical field of data processing, and in particular to a processing method, apparatus, and device for interaction during live game spectating.
Background
With the development of internet technology, online games are evolving rapidly. To increase users' enthusiasm for participating, multiple clients can take part in a game simultaneously through live game broadcasting.
During a live game broadcast, participating users fall into two categories: player users, who take part in the game battle, and spectator users, who watch it. The client of a player user may be referred to as a player client, and the client of a spectator user as a spectator client.
While watching a battle, a spectator user may have ideas or game strategies they wish to communicate to other spectators. In the prior art, however, spectators can only communicate through text, such as chat channels or bullet comments (barrage), on the spectator client. Text alone is often insufficient to express a spectator's ideas and strategies accurately; other spectators may misunderstand them and fail to grasp the intended meaning. The accuracy of communication therefore cannot be guaranteed, which degrades the spectators' participation experience.
Disclosure of Invention
Embodiments of the invention provide a processing method, apparatus, and device for interaction during live game spectating, with the aim of improving the accuracy of semantic transmission during communication.
In a first aspect, an embodiment of the present invention provides a processing method for interaction during live game spectating, in which a graphical user interface is provided by a spectator client, the graphical user interface displaying a game scene. The method includes: the first spectator client responds to a touch operation applied to the game scene, obtains first target drawing information corresponding to the touch operation, and renders and displays it in the game scene according to that information; and the first spectator client sends the first target drawing information to a server, so that the server synchronizes it to at least one second spectator client, which renders and displays it accordingly.
Optionally, the first spectator client obtains the first target drawing information as follows: the first spectator client identifies the two-dimensional coordinates of each touch position of the touch operation, converts them in order into three-dimensional coordinates in the game scene, and uses the ordered three-dimensional coordinates as the first target drawing information.
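The patent gives no code; the following minimal Python sketch illustrates the optional step above under a simplifying assumption — a top-down camera whose screen pixels map linearly onto the scene's ground plane. The `Camera` type and its fields are hypothetical stand-ins, not part of the patent.

```python
from dataclasses import dataclass
from typing import List, Tuple

@dataclass
class Camera:
    """Hypothetical top-down camera: screen pixels map linearly onto the
    game scene's ground plane (y = 0)."""
    scene_origin: Tuple[float, float]   # world x/z corresponding to screen (0, 0)
    units_per_pixel: float              # world units covered by one pixel

def screen_to_scene(camera: Camera,
                    touch_points: List[Tuple[int, int]]
                    ) -> List[Tuple[float, float, float]]:
    """Convert 2D touch positions (in drawing order) into 3D scene
    coordinates; the ordered result plays the role of the
    'first target drawing information'."""
    ox, oz = camera.scene_origin
    return [
        (ox + px * camera.units_per_pixel,   # world x
         0.0,                                # ground-plane height
         oz + py * camera.units_per_pixel)   # world z
        for (px, py) in touch_points
    ]
```

A real 3D client would instead ray-cast through the camera's projection matrix, but the essential point is the same: each 2D touch position yields one 3D scene position, and the drawing order is preserved.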
Optionally, if the graphical user interface includes a start button, the first spectator client obtains the first target drawing information only after recognizing a first touch operation on the start button; it then responds to touch operations applied to the game scene and obtains the first target drawing information corresponding to them.
Optionally, the graphical user interface further includes a settings button, and the method further includes: the first spectator client responds to a second touch operation on the settings button and updates the drawing parameters associated with it, where the drawing parameters are at least one of line thickness, line shape, and line color, or a combination thereof.
Optionally, rendering and displaying in the game scene according to the first target drawing information includes: rendering in the game scene according to the first target drawing information and the drawing parameters, so as to display the shape corresponding to the first target drawing information, the shape conforming to the drawing parameters.
Optionally, the method further includes: the first spectator client receives second target drawing information, drawn on a second spectator client and synchronized by the server, and renders and displays it in the game scene.
Optionally, rendering and displaying in the game scene according to the second target drawing information includes: parsing the second target drawing information to determine the three-dimensional coordinates of each position and their order, and rendering in the game scene according to those coordinates in that order, so as to display the shape corresponding to the second target drawing information.
Optionally, rendering and displaying in the game scene according to the second target drawing information further includes: rendering in the game scene according to the second target drawing information and the drawing parameters, so as to display the shape corresponding to the second target drawing information, the shape conforming to the drawing parameters.
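The parsing step above — recovering ordered 3D positions and turning them into renderable line segments — can be sketched in Python. The JSON wire format and field names here are assumptions for illustration; the patent does not specify a serialization.

```python
import json
from typing import List, Tuple

Point3 = Tuple[float, float, float]

def parse_drawing_info(payload: str) -> List[Point3]:
    """Parse synchronized target drawing information (a hypothetical JSON
    format) back into an ordered list of 3D scene positions."""
    points = json.loads(payload)["points"]          # assumed field name
    return [(p["x"], p["y"], p["z"]) for p in points]

def to_segments(points: List[Point3]) -> List[Tuple[Point3, Point3]]:
    """Pair consecutive points: the line segments a client would render,
    in drawing order."""
    return list(zip(points, points[1:]))
```

The receiving client would then draw each segment using its local drawing parameters (thickness, shape, color).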
In a second aspect, an embodiment of the present invention provides a processing apparatus for interaction during live game spectating, which provides a graphical user interface through a spectator client, the graphical user interface displaying a game scene. The apparatus includes: a touch module, by which the first spectator client responds to a touch operation in the game scene, obtains the corresponding first target drawing information, and renders and displays it in the game scene; and a processing module, by which the first spectator client sends the first target drawing information to the server, so that the server synchronizes it to at least one second spectator client for rendering and display.
In a third aspect, an embodiment of the present invention provides a processing device based on interaction in live broadcast fighting, including: at least one processor and memory;
the memory stores computer-executable instructions;
the at least one processor executes the computer-executable instructions stored by the memory, causing the at least one processor to perform the processing method for interaction during live game spectating according to any one of the first aspects.
In a fourth aspect, an embodiment of the present invention provides a computer-readable storage medium storing computer-executable instructions which, when executed by a processor, implement the processing method for interaction during live game spectating according to any one of the first aspects.
Embodiments of the invention provide a processing method, apparatus, and device for interaction during live game spectating. The method responds to a touch operation applied to the game scene, obtains first target drawing information corresponding to the touch operation, renders and displays it in the game scene, and then sends it to the server, which synchronizes it to at least one second spectator client for rendering and display. Because the spectator client can determine target drawing information from the touch operation it receives during a live broadcast, and forward that information through the server to the other spectator clients participating in the broadcast, the other clients can render and display it synchronously. Communication is thus no longer limited, as in the prior art, to text forms such as chat channels or bullet comments, which resolves the technical problem that text communication is inaccurate; the accuracy of semantic communication improves, and with it the spectators' participation experience.
Drawings
In order to more clearly illustrate the embodiments of the present invention or the technical solutions in the prior art, the drawings used in the description of the embodiments or the prior art will be briefly described below, and it is obvious that the drawings in the following description are only some embodiments of the present invention, and for those skilled in the art, other drawings can be obtained according to these drawings without creative efforts.
FIG. 1 is a diagram of a game application scenario provided by an embodiment of the present invention;
fig. 2 is a schematic structural diagram of an application system of a processing method based on live broadcast fighting interaction according to an embodiment of the present invention;
fig. 3 is a schematic flow chart of a processing method based on interaction in live broadcast fighting according to an embodiment of the present invention;
fig. 4 is an application schematic diagram of a processing method based on interaction in live broadcast fighting according to an embodiment of the present invention;
FIG. 5 is a schematic view of a game scene interface provided in an embodiment of the present invention;
FIG. 6 is a flowchart illustrating a processing method based on interaction in live broadcast fighting according to a second embodiment of the present invention;
FIG. 7 is a schematic view of a game scene interface according to another embodiment of the present invention;
fig. 8 is a schematic flow chart of a processing method based on interaction in live broadcast fighting according to a third embodiment of the present invention;
FIG. 9 is a schematic structural diagram of a processing apparatus based on live broadcast fighting interaction according to an embodiment of the present invention;
fig. 10 is a schematic hardware structure diagram of a processing device based on interaction in live broadcast fighting according to an embodiment of the present invention.
Detailed Description
The technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are only a part of the embodiments of the present invention, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
The terms "first," "second," "third," "fourth," and the like in the description and in the claims, as well as in the drawings, if any, are used for distinguishing between similar elements and not necessarily for describing a particular sequential or chronological order. It is to be understood that the data so used are interchangeable under appropriate circumstances such that the embodiments of the invention described herein are capable of including other sequential examples in addition to those illustrated or described. Furthermore, the terms "comprises," "comprising," and "having," and any variations thereof, are intended to cover a non-exclusive inclusion, such that a process, method, system, article, or apparatus that comprises a list of steps or elements is not necessarily limited to those steps or elements expressly listed, but may include other steps or elements not expressly listed or inherent to such process, method, article, or apparatus.
During a live game broadcast, a spectator may have ideas and game strategies to share with other spectator users while watching a battle. For example, FIG. 1 is a game application scene diagram provided by an embodiment of the present invention. As shown in FIG. 1, a spectator user wants to express that "player 1" should take a route around the forest and attack from behind "player 4". In the prior art, spectators can only communicate through text forms such as chat channels or bullet comments on the spectator client. In this example, because there are forests on both the left and right of the players, it is difficult to convey the expresser's intention correctly in text; the accuracy of communication cannot be guaranteed, which degrades the spectators' participation experience.
To address these problems, the present application first obtains target drawing information corresponding to a touch operation applied to the game scene, and then synchronizes that information through the server to other spectator clients for rendering and display. Replacing text with drawing makes communication more convenient and rapid, and effectively improves the accuracy of semantic transmission during communication.
Fig. 2 is a schematic structural diagram of an application system for the processing method based on live spectating interaction according to an embodiment of the present invention. As shown in Fig. 2, the system mainly includes a plurality of spectator clients, which may be mobile phones, tablets, etc.; the diagram takes a first spectator client 201 and a second spectator client 203 as examples. The system further includes a server 202. The first spectator client 201 may be the client on which the target drawing information is drawn; the information is then forwarded through the server 202 to the other spectator clients participating in the live broadcast, for example the second spectator client 203, for rendering and display.
In addition, the first spectator client 201 and the second spectator client 203 each play both roles: every spectator client can draw target drawing information and forward it through the server 202 to the other spectator clients participating in the live broadcast, and can also receive target drawing information sent by other spectator clients via the server and render and display it.
The technical solution of the present invention will be described in detail below with specific examples. The following several specific embodiments may be combined with each other, and details of the same or similar concepts or processes may not be repeated in some embodiments.
Fig. 3 is a flowchart illustrating a processing method for interaction during live game spectating according to an embodiment of the present invention; the method of this embodiment may be executed by the first spectator client 201. As shown in Fig. 3, the method, in which a graphical user interface is provided through a spectator client and displays a game scene, may include:
s301: the first spectator client responds to touch operation acting on the game scene to acquire first target drawing information corresponding to the touch operation, and rendering and displaying are carried out in the game scene according to the first target drawing information.
In this embodiment, a spectator user may want to share ideas or game strategies with other spectators during a live broadcast. To make communication convenient and expression accurate, the user can draw directly on the spectator client: the client senses the touch operation applied to the game scene and then obtains the first target drawing information corresponding to it.
Optionally, the first target drawing information consists of the position points involved in the touch operation together with their order. For example, the position points can be stored in a queue in drawing order; since a queue is first-in first-out, the first spectator client retrieves the points from the queue in the order they were stored and renders them in sequence, thereby rendering and displaying the first target drawing information.
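The queue described above can be sketched in a few lines of Python; the class and method names are illustrative only.

```python
from collections import deque

class StrokeBuffer:
    """Minimal sketch of the queue described above: touch positions are
    enqueued in drawing order and dequeued first-in first-out for rendering."""

    def __init__(self):
        self._points = deque()

    def record(self, point):
        """Store a position point in drawing order."""
        self._points.append(point)

    def drain_for_render(self):
        """Yield points in the order they were drawn (FIFO)."""
        while self._points:
            yield self._points.popleft()
```

Because `deque` preserves insertion order, the rendering loop reproduces the stroke exactly as the user drew it.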
Alternatively, the first target drawing information may describe a circle, a line, or another irregular shape.
S302: the first spectator client sends the first target drawing information to the server so that the server synchronizes the first target drawing information to at least one second spectator client, and the rendering display processing is carried out according to the first target drawing information.
In this embodiment, the first spectator client sends the obtained first target drawing information through the server to the other spectator clients for rendering and display, so that other spectator users can accurately grasp the meaning the drawing user wants to express.
Optionally, after another spectator client, such as the second spectator client described above, receives the first target drawing information, its interface is rendered according to that information, for example according to the position points it contains, so that its user can quickly, intuitively, and clearly understand what the user of the first spectator client wants to express.
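The server-side synchronization in S302 can be sketched as a simple broadcast: the server relays drawing information from the sender to every other spectator client in the same broadcast. This hypothetical in-memory `RelayServer` stands in for server 202; a real implementation would push over network connections.

```python
class RelayServer:
    """Hypothetical relay standing in for the server: receives target
    drawing information from one spectator client and synchronizes it to
    every other client in the same live broadcast."""

    def __init__(self):
        self.clients = {}  # client_id -> inbox of received drawing info

    def join(self, client_id):
        """Register a spectator client with the broadcast."""
        self.clients[client_id] = []

    def broadcast(self, sender_id, drawing_info):
        """Synchronize drawing info to all clients except the sender,
        which has already rendered it locally."""
        for cid, inbox in self.clients.items():
            if cid != sender_id:
                inbox.append(drawing_info)
```

Skipping the sender mirrors the patent's flow: the first spectator client renders locally in S301 and only the other clients render from the synchronized copy.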
In addition, other spectator clients can configure the display according to actual needs. If a spectator user does not want to see ideas or strategies shared by others, the display option can be turned off in advance on that client; during the live broadcast the client then still receives target drawing information drawn by other spectators, but does not render it. When the user wants to view shared ideas or strategies again, the display option can be turned back on, and the client resumes receiving and rendering target drawing information drawn by others. This makes the setting flexible, meets users' actual needs, and further improves their participation experience.
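The receive-but-optionally-render behavior just described can be sketched as a flag checked at render time; the class and attribute names are illustrative.

```python
class SpectatorView:
    """Sketch of the display toggle: drawing info is always received,
    but only rendered while `show_shared` is enabled."""

    def __init__(self):
        self.show_shared = True
        self.received = []   # everything synchronized from the server
        self.rendered = []   # only what was actually drawn on screen

    def on_drawing_info(self, info):
        self.received.append(info)      # always stored, per the patent
        if self.show_shared:
            self.rendered.append(info)  # rendered only when enabled
```

Keeping reception unconditional matches the text above: toggling the option changes only what is rendered, not what the client receives.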
For example, Fig. 4 is an application schematic diagram of the processing method for interaction during live game spectating provided by an embodiment of the present invention. As shown in Fig. 4, and continuing the example of Fig. 1, a spectator user wants to express during the live broadcast that player 1 should take a route around the forest and attack from behind player 4. Because there are forests on both sides of the players, this intention is hard to convey correctly in text; by simply drawing the route in the game scene in the manner of Fig. 4, the spectator's idea is expressed clearly.
After the above scheme is adopted, a spectator client can acquire the target drawing information corresponding to a drawing operation acting on the game scene during live watching and forward it through the server to the other spectator clients participating in the live broadcast, so that those clients can synchronously render and display according to the target drawing information. Communication is therefore no longer limited to text forms such as chat channels or barrages, which improves the accuracy of semantic communication and further improves the participation experience of spectator users.
Based on this, in this embodiment, first target drawing information corresponding to a touch operation applied to the game scene is obtained in response to that touch operation, rendering and display are performed in the game scene according to the first target drawing information, and the first target drawing information is then sent to a server so that the server synchronizes it to at least one second spectator client for rendering and display processing. Since a spectator client can determine the target drawing information from the obtained touch operation during live watching and forward it through the server to the other spectator clients participating in the live broadcast for synchronous rendering and display, communication is no longer limited to text forms such as chat channels or barrages as in the prior art. The technical problem of the inaccuracy of text-based communication is thereby solved, the accuracy of semantic communication is improved, and the participation experience of spectator users is improved.
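The forwarding path described in this embodiment (first spectator client, then server, then the other spectator clients) can be sketched as follows; the class and method names are hypothetical, and the per-client queues stand in for real network channels:

```python
# Minimal sketch of the relay described above: the server receives target
# drawing information from one spectator client and synchronizes it to every
# other spectator client in the same live broadcast. Names are illustrative.

class RelayServer:
    def __init__(self):
        self.clients = {}  # client_id -> outbound message queue

    def register(self, client_id):
        self.clients[client_id] = []

    def on_drawing_info(self, sender_id, drawing_info):
        # Synchronize the drawing information to every client except the sender
        for client_id, queue in self.clients.items():
            if client_id != sender_id:
                queue.append(drawing_info)
```

Each receiving client would then render and display according to the drawing information taken from its queue, as described above.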
Fig. 6 is a schematic flow chart of a processing method based on interaction in live broadcast fighting according to a second embodiment of the present invention, based on the embodiment shown in fig. 3. As shown in fig. 5, which is a schematic view of a game scene interface according to the second embodiment of the present invention, the graphical user interface includes a start button. A specific implementation manner of S301 is then:
when the first spectator client identifies a first touch operation on the start button, it responds to the touch operation acting on the game scene to acquire the first target drawing information corresponding to the touch operation.
In this embodiment, when the touch operation of the spectator user is obtained, whether the spectator user has touched the start button can be identified first. Only after the spectator user touches the start button is the touch operation acting on the game scene responded to in order to obtain the first target drawing information corresponding to the touch operation, which avoids disturbing other spectator users through misoperation.
Further, as shown in fig. 6, in S301, in response to a touch operation applied to a game scene, a specific implementation manner of the first spectator client obtaining the first target drawing information corresponding to the touch operation is as follows:
s601, the first spectator client identifies two-dimensional coordinate position information of each touch position under the touch operation.
S602, the first spectator client sequentially converts the two-dimensional coordinate position information of each touch position into three-dimensional coordinate position information applied to a game scene, and the three-dimensional coordinate position information sequentially acquired is used as first target drawing information.
In this embodiment, the first spectator client acquires the two-dimensional coordinate position information of the spectator user's touches on the current screen, converts it into three-dimensional coordinate position information in the game scene, and stores the three-dimensional coordinate position information in the order acquired to obtain the first target drawing information. The conversion from two-dimensional to three-dimensional coordinate position information may use existing techniques, for example projection conversion, or any other technique capable of implementing this function; it is not particularly limited herein.
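Under simplifying assumptions, such a projection-style conversion could look like the following sketch, which casts a ray from a hypothetical camera through the touch point and intersects it with the ground plane y = 0; the camera model and all names are illustrative, not taken from this application:

```python
# Illustrative 2-D -> 3-D conversion: cast a ray from the camera through the
# touched screen point and intersect it with the ground plane y = 0.
# The camera parameters (position, basis vectors, half-FOV tangent) are
# hypothetical inputs; a real engine would supply them.

def unproject_touch(touch_x, touch_y, screen_w, screen_h,
                    cam_pos, cam_fwd, cam_right, cam_up, fov_half_tan):
    # Normalized device coordinates in [-1, 1]
    ndx = (2.0 * touch_x / screen_w) - 1.0
    ndy = 1.0 - (2.0 * touch_y / screen_h)
    aspect = screen_w / screen_h
    # Ray direction in world space
    d = [cam_fwd[i]
         + cam_right[i] * ndx * fov_half_tan * aspect
         + cam_up[i] * ndy * fov_half_tan
         for i in range(3)]
    # Intersect with the ground plane y = 0: cam_pos.y + t * d.y = 0
    t = -cam_pos[1] / d[1]
    return tuple(cam_pos[i] + t * d[i] for i in range(3))
```

Applying this to each touch position in order, and storing the results in sequence, would yield first target drawing information of the kind described in S602.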
Further, fig. 7 is a schematic view of a game scene interface according to another embodiment of the present invention. As shown in fig. 7, where the graphical user interface further includes a setting button, the method may further include:
and the first spectator client responds to the second touch operation acting on the setting key to update the drawing parameters corresponding to the setting key.
The drawing parameters include at least one of, or a combination of, line thickness, line shape, and line color.
A specific implementation manner of performing rendering display in the game scene according to the first target drawing information in S301 is as follows:
rendering in the game scene according to the first target drawing information and the drawing parameters to display the shape corresponding to the first target drawing information, wherein the parameters of the shape conform to the drawing parameters.
In this embodiment, when the spectator client performs rendering and display, drawing parameters may be used in addition to the basic position point information. A drawing parameter is a parameter that modifies the drawn effect and may be at least one of, or a combination of, line thickness, line shape, and line color. The corresponding shape can be determined from the sequentially stored basic position point information, and the determined shape can then be modified according to the drawing parameters. For example, the line color may be set to red, or the line shape may be set to a dotted line.
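One possible way to represent such drawing parameters and attach them to a shape is sketched below; the field names and default values are assumptions for illustration, not taken from this application:

```python
# Illustrative representation of drawing parameters (line thickness, shape,
# and color) and of applying them to a polyline built from ordered points.

from dataclasses import dataclass, replace

@dataclass(frozen=True)
class DrawingParams:
    thickness: float = 1.0
    shape: str = "solid"      # e.g. "solid" or "dotted"
    color: str = "#ffffff"

def styled_polyline(points, params):
    # Bundle the ordered position points with the style the renderer applies
    return {"points": list(points),
            "thickness": params.thickness,
            "shape": params.shape,
            "color": params.color}
```

An update via the setting button would then amount to producing new parameters, e.g. `replace(params, color="red", shape="dotted")`, which can happen before or after the drawing touch operation, consistent with the timing described below.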
In addition, the update of the drawing parameters via the setting button may be performed by the spectator user either before or after the drawing touch operation; that is, the timing of the drawing-parameter update processing is not particularly limited.
In this embodiment, the drawing parameters can be updated in real time, which improves the flexibility of the spectator users' settings and also improves their participation experience. When the drawing parameters are updated, they can be updated according to the preferences and actual requirements of the spectator users, further improving the flexibility of the settings.
Fig. 8 is a schematic flow chart of a processing method based on live broadcast fighting interaction according to a third embodiment of the present invention, and based on the embodiment shown in fig. 3 or fig. 6, the method may further include:
S801, the first spectator client acquires second target drawing information sent by a second spectator client and synchronized by the server, and performs rendering and display processing in the game scene according to the second target drawing information.
Optionally, in S801, according to the second target drawing information, a specific implementation manner of performing rendering and display processing in the game scene is as follows:
rendering in the game scene according to the second target drawing information and the drawing parameters to display a shape corresponding to the second target drawing information, wherein the parameters of the shape conform to the drawing parameters.
Specifically, in addition to sending target drawing information to the other second spectator clients for rendering and display, the first spectator client can also receive, through the server, second target drawing information sent by other second spectator clients, and then render and display the second target drawing information in the game scene.
When the first spectator client renders and displays the second target drawing information, the principle is similar to that of the second spectator client rendering and displaying the first target drawing information: the three-dimensional coordinate position information of each position and the corresponding order are determined, and rendering is then performed in the game scene according to that position information and order so as to display the shape corresponding to the second target drawing information.
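Turning the ordered three-dimensional position information into renderable line segments can be sketched as follows (function name hypothetical):

```python
# Illustrative sketch: convert the ordered 3-D position list carried in the
# target drawing information into consecutive line segments for rendering.

def to_segments(ordered_points):
    # Each consecutive pair of points forms one segment of the drawn shape
    return [(ordered_points[i], ordered_points[i + 1])
            for i in range(len(ordered_points) - 1)]
```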
Optionally, permissions of the spectator clients may also be set; for example, the permission of the first spectator client may be set to a talkback permission, and the permissions of the other second spectator clients may be set to an audience permission. In that case, only the first spectator client can draw information and have it synchronously rendered and displayed on the other second spectator clients, while a second spectator client can only render and display and cannot perform the touch operation. This avoids the situation where, because there are too many spectators and too much drawing information, the ideas and strategies of higher-level spectator users cannot be displayed accurately.
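The optional permission scheme can be sketched as follows; the permission names and function names are illustrative assumptions:

```python
# Hypothetical sketch of the optional permission scheme: only clients with
# "talkback" permission may draw; "audience" clients can only render what
# they receive.

PERMISSIONS = {}  # client_id -> "talkback" | "audience"

def can_draw(client_id):
    return PERMISSIONS.get(client_id) == "talkback"

def handle_touch(client_id, drawing_info, broadcast):
    if not can_draw(client_id):
        return False  # ignore drawing touches from audience clients
    broadcast(client_id, drawing_info)
    return True
```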
Optionally, the second spectator client may also update the drawing parameters. When the first spectator client receives the second target drawing information and the drawing parameters sent by the second spectator client through the server, it renders in the game scene to display a shape corresponding to the second target drawing information, the parameters of the shape conforming to the drawing parameters. The principle is similar to the drawing-parameter update processing performed by the first spectator client and is not repeated here.
An embodiment of the present specification further provides a device corresponding to the method. Fig. 9 is a schematic structural diagram of a processing device based on interaction in live broadcast fighting provided in an embodiment of the present invention. As shown in fig. 9, a graphical user interface is provided by a spectator client, the graphical user interface displaying a game scene, and the device may include: a touch module 901 and a processing module 902. The touch module 901 is configured for the first spectator client to respond to a touch operation applied to the game scene so as to acquire first target drawing information corresponding to the touch operation, and to perform rendering and display in the game scene according to the first target drawing information. The processing module 902 is configured for the first spectator client to send the first target drawing information to a server, so that the server synchronizes the first target drawing information to at least one second spectator client for rendering and display processing according to the first target drawing information.
The apparatus of this embodiment may perform the above-described embodiment of the method shown in fig. 3, and the implementation principle and the technical effect are similar, which are not described herein again.
In one embodiment, the graphical user interface includes a start button, and the touch module is further configured to:
when the first spectator client identifies a first touch operation on the start button, responding to the touch operation acting on the game scene to acquire the first target drawing information corresponding to the touch operation.
In one embodiment, the touch module is further configured to:
the first spectator client identifies two-dimensional coordinate position information of each touch position under the touch operation.
And the first spectator client sequentially converts the two-dimensional coordinate position information of each touch position into three-dimensional coordinate position information applied to the game scene, so that the sequentially acquired three-dimensional coordinate position information is used as first target drawing information.
In a specific embodiment, the graphical user interface further includes a setting button, and the touch module is further configured to:
and the first spectator client responds to the second touch operation acting on the setting key to update the drawing parameters corresponding to the setting key.
The drawing parameters are at least one or a combination of a plurality of line thicknesses, line shapes and line colors.
In one embodiment, the touch module is further configured to:
rendering in the game scene according to the first target drawing information and the drawing parameters so as to display the shape corresponding to the first target drawing information, wherein the parameters of the shape conform to the drawing parameters.
In one embodiment, the processing module is further configured to:
and the first spectator client acquires second target drawing information sent by a second spectator client synchronized with the server, and performs rendering display processing in the game scene according to the second target drawing information.
In one embodiment, the processing module is further configured to:
and analyzing the second target drawing information, and determining the three-dimensional coordinate position information of each position and the corresponding sequence.
Rendering in the game scene according to the three-dimensional coordinate position information of each position and the corresponding sequence so as to display the shape corresponding to the second target drawing information.
In one embodiment, the processing module is further configured to:
rendering in the game scene according to the second target drawing information and the drawing parameters to display a shape corresponding to the second target drawing information, wherein the parameters of the shape conform to the drawing parameters.
The apparatus provided in the embodiment of the present invention may implement the method according to the embodiment shown in fig. 3, and the implementation principle and the technical effect are similar, which are not described herein again.
Fig. 10 is a schematic hardware structure diagram of a processing device based on interaction in live broadcast fighting according to an embodiment of the present invention. As shown in fig. 10, the present embodiment provides an apparatus 1000 including: at least one processor 1001 and memory 1002. The processor 1001 and the memory 1002 are connected to each other via a bus 1003.
In a specific implementation process, at least one processor 1001 executes computer-executable instructions stored in the memory 1002, so that at least one processor 1001 executes the method in the above method embodiment.
For a specific implementation process of the processor 1001, reference may be made to the above method embodiments, which have similar implementation principles and technical effects, and details of this embodiment are not described herein again.
In the embodiment shown in fig. 10, it should be understood that the Processor may be a Central Processing Unit (CPU), another general-purpose processor, a Digital Signal Processor (DSP), an Application Specific Integrated Circuit (ASIC), etc. A general-purpose processor may be a microprocessor, or the processor may be any conventional processor. The steps of a method disclosed in connection with the present invention may be embodied directly in a hardware processor, or performed by a combination of hardware and software modules within the processor.
The memory may comprise high speed RAM memory and may also include non-volatile storage NVM, such as at least one disk memory.
The bus may be an Industry Standard Architecture (ISA) bus, a Peripheral Component Interconnect (PCI) bus, an Extended ISA (EISA) bus, or the like. The bus may be divided into an address bus, a data bus, a control bus, etc. For ease of illustration, the buses in the figures of the present application are not limited to only one bus or one type of bus.
An embodiment of the invention further provides a computer-readable storage medium, in which computer-executable instructions are stored; when a processor executes the computer-executable instructions, the processing method based on interaction in live broadcast fighting described above is implemented.
The computer-readable storage medium may be implemented by any type of volatile or non-volatile memory device or combination thereof, such as Static Random Access Memory (SRAM), electrically erasable programmable read-only memory (EEPROM), erasable programmable read-only memory (EPROM), programmable read-only memory (PROM), read-only memory (ROM), magnetic memory, flash memory, magnetic or optical disk. Readable storage media can be any available media that can be accessed by a general purpose or special purpose computer.
An exemplary readable storage medium is coupled to the processor such that the processor can read information from, and write information to, the readable storage medium. Of course, the readable storage medium may also be an integral part of the processor. The processor and the readable storage medium may reside in an application-specific integrated circuit (ASIC). Of course, the processor and the readable storage medium may also reside as discrete components in the apparatus.
Those of ordinary skill in the art will understand that: all or a portion of the steps of implementing the above-described method embodiments may be performed by hardware associated with program instructions. The program may be stored in a computer-readable storage medium. When executed, the program performs steps comprising the method embodiments described above; and the aforementioned storage medium includes: various media that can store program codes, such as ROM, RAM, magnetic or optical disks.
Finally, it should be noted that: the above embodiments are only used to illustrate the technical solution of the present invention, and not to limit the same; while the invention has been described in detail and with reference to the foregoing embodiments, it will be understood by those skilled in the art that: the technical solutions described in the foregoing embodiments may still be modified, or some or all of the technical features may be equivalently replaced; and the modifications or the substitutions do not make the essence of the corresponding technical solutions depart from the scope of the technical solutions of the embodiments of the present invention.

Claims (11)

1. A processing method based on interaction in live broadcast fighting, characterized in that a graphical user interface is provided through a spectator client, the graphical user interface displays a game scene, and the method comprises the following steps:
the first spectator client responds to touch operation acting on a game scene to acquire first target drawing information corresponding to the touch operation, and rendering and displaying are carried out in the game scene according to the first target drawing information;
and the first spectator client sends the first target drawing information to a server so that the server synchronizes the first target drawing information to at least one second spectator client, and the rendering display processing is carried out according to the first target drawing information.
2. The method of claim 1, wherein the first spectator client is responsive to a touch operation applied to the game scene to obtain first target drawing information corresponding to the touch operation, and comprises:
the first spectator client identifies two-dimensional coordinate position information of each touch position under the touch operation;
and the first spectator client sequentially converts the two-dimensional coordinate position information of each touch position into three-dimensional coordinate position information applied to the game scene, and uses the sequentially acquired three-dimensional coordinate position information as the first target drawing information.
3. The method of claim 1, wherein a start button is included in the graphical user interface, and the first spectator client responds to a touch operation applied to the game scene to obtain first target drawing information corresponding to the touch operation, and includes:
when the first spectator client identifies the first touch operation on the start button, responding to the touch operation acting on the game scene to acquire the first target drawing information corresponding to the touch operation.
4. The method of claim 1, wherein the graphical user interface further comprises a setup button, the method further comprising:
and the first spectator client responds to a second touch operation acting on the setting key to update the drawing parameters corresponding to the setting key, wherein the drawing parameters are at least one or a combination of line thickness, line shape and line color.
5. The method of claim 4, wherein the rendering and displaying in the game scene according to the first target drawing information comprises:
rendering in the game scene according to the first target drawing information and the drawing parameters to display the shape corresponding to the first target drawing information, wherein the parameters of the shape conform to the drawing parameters.
6. The method of claim 1 or 4, further comprising:
the first spectator client acquires second target drawing information sent by a second spectator client and synchronized by the server, and performs rendering and display processing in the game scene according to the second target drawing information.
7. The method according to claim 6, wherein the rendering and displaying in the game scene according to the second object drawing information includes:
analyzing the second target drawing information, and determining three-dimensional coordinate position information and a corresponding sequence of each position;
rendering in the game scene according to the three-dimensional coordinate position information of each position and the corresponding sequence so as to display the shape corresponding to the second target drawing information.
8. The method according to claim 6, wherein the rendering and displaying in the game scene according to the second object drawing information includes:
rendering in the game scene according to the second target drawing information and the drawing parameters to display a shape corresponding to the second target drawing information, wherein the parameters of the shape conform to the drawing parameters.
9. A processing apparatus based on interaction in live broadcast fighting, characterized in that a graphical user interface is provided through a spectator client, the graphical user interface displays a game scene, and the apparatus comprises:
the touch control module is used for responding to touch operation acted in a game scene by the first spectator client so as to obtain first target drawing information corresponding to the touch operation and performing rendering display in the game scene according to the first target drawing information;
and the processing module is used for sending the first target drawing information to the server side by the first spectator client side so as to enable the server side to synchronize the first target drawing information to at least one second spectator client side, and then performing rendering display processing according to the first target drawing information.
10. A processing device based on interaction in live broadcast fighting, characterized by comprising: at least one processor and a memory;
the memory stores computer-executable instructions;
the at least one processor executes the computer-executable instructions stored by the memory, causing the at least one processor to perform the processing method based on interaction in live broadcast fighting according to any one of claims 1 to 8.
11. A computer-readable storage medium, characterized in that the computer-readable storage medium stores computer-executable instructions which, when executed by a processor, implement the processing method based on interaction in live broadcast fighting according to any one of claims 1 to 8.
CN202010393726.1A 2020-05-11 2020-05-11 Processing method, device and equipment based on interaction in live broadcast fighting Pending CN111569436A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010393726.1A CN111569436A (en) 2020-05-11 2020-05-11 Processing method, device and equipment based on interaction in live broadcast fighting

Publications (1)

Publication Number Publication Date
CN111569436A true CN111569436A (en) 2020-08-25

Family

ID=72113391

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010393726.1A Pending CN111569436A (en) 2020-05-11 2020-05-11 Processing method, device and equipment based on interaction in live broadcast fighting

Country Status (1)

Country Link
CN (1) CN111569436A (en)

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105187930A (en) * 2015-09-18 2015-12-23 广州酷狗计算机科技有限公司 Video live broadcasting-based interaction method and device
CN108768832A (en) * 2018-05-24 2018-11-06 腾讯科技(深圳)有限公司 Exchange method and device, storage medium, electronic device between client
CN110090449A (en) * 2019-04-26 2019-08-06 网易(杭州)网络有限公司 System that method is watched in a kind of game and game is watched
CN110881144A (en) * 2018-09-05 2020-03-13 武汉斗鱼网络科技有限公司 Data processing method based on live broadcast platform and related equipment

Cited By (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112221148A (en) * 2020-10-15 2021-01-15 网易(杭州)网络有限公司 Game skill release state synchronization method, server and readable storage medium
CN112221148B (en) * 2020-10-15 2024-03-22 网易(杭州)网络有限公司 Game skill release state synchronization method, server and readable storage medium
CN113613060A (en) * 2021-08-03 2021-11-05 广州繁星互娱信息科技有限公司 Drawing live broadcast method, device, equipment and storage medium
CN113633996A (en) * 2021-08-12 2021-11-12 腾讯科技(成都)有限公司 Chat message transmission method and device, storage medium and electronic equipment
CN113633996B (en) * 2021-08-12 2023-08-25 腾讯科技(成都)有限公司 Chat message transmission method and device, storage medium and electronic equipment
CN116248912A (en) * 2023-05-12 2023-06-09 南京维赛客网络科技有限公司 Method, system and storage medium for annotating live streaming picture in real time

Similar Documents

Publication Publication Date Title
CN110944235B (en) Live broadcast interaction method, device and system, electronic equipment and storage medium
CN106254311B (en) Live broadcast method and device and live broadcast data stream display method and device
CN111569436A (en) Processing method, device and equipment based on interaction in live broadcast fighting
WO2017054465A1 (en) Information processing method, terminal and computer storage medium
CN110856032B (en) Live broadcast method, device, equipment and storage medium
CN109428859B (en) Synchronous communication method, terminal and server
CN109195003B (en) Interaction method, system, terminal and device for playing game based on live broadcast
CN114727146B (en) Information processing method, device, equipment and storage medium
CN113691829B (en) Virtual object interaction method, device, storage medium and computer program product
CN113411656B (en) Information processing method, information processing device, computer equipment and storage medium
CN114390308B (en) Interface display method, device, equipment, medium and product in live broadcast process
CN108012195B (en) Live broadcast method and device and electronic equipment thereof
US20230306694A1 (en) Ranking list information display method and apparatus, and electronic device and storage medium
CN111760266A (en) Game live broadcast method and device and electronic equipment
CN114025180A (en) Game operation synchronization system, method, device, equipment and storage medium
CN112138381A (en) Game data processing method and device, storage medium and electronic device
CN113596555B (en) Video playing method and device and electronic equipment
CN113536147B (en) Group interaction method, device, equipment and storage medium
CN106792237B (en) Message display method and system
CN112187624B (en) Message reply method and device and electronic equipment
US11146413B2 (en) Synchronous communication
US20230195403A1 (en) Information processing method and electronic device
CN112717422B (en) Real-time information interaction method and device, equipment and storage medium
CN111249723B (en) Method, device, electronic equipment and storage medium for display control in game
CN114116105A (en) Control method and device of dynamic desktop, storage medium and electronic device

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination