CN114307154A - Game live broadcast control method and device, computer storage medium and electronic equipment - Google Patents

Game live broadcast control method and device, computer storage medium and electronic equipment Download PDF

Info

Publication number
CN114307154A
CN114307154A (Application CN202111630501.4A)
Authority
CN
China
Prior art keywords
game
virtual scene
live
elements
type
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202111630501.4A
Other languages
Chinese (zh)
Inventor
庄宇轩
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Guangzhou Boguan Information Technology Co Ltd
Original Assignee
Guangzhou Boguan Information Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Guangzhou Boguan Information Technology Co Ltd
Priority application: CN202111630501.4A
Publication: CN114307154A
Legal status: Pending

Abstract

The disclosure relates to the field of internet technology, and provides a game live broadcast control method and apparatus, a computer storage medium, and an electronic device. The method includes: receiving an instruction to start a current game play, constructing a virtual scene corresponding to the current game, and rendering a captured anchor image into the virtual scene; pre-generating, in the virtual scene, the game elements required by the current game according to the current game play; receiving sensing data from a live audience terminal, and controlling the motion of the game elements in the virtual scene according to the sensing data; and triggering the game event corresponding to a game element when that element's motion state parameters reach a preset condition. By constructing a virtual scene for the current game play inside the live interface, generating the game elements the play requires, and controlling their movement according to sensing data from the live audience terminals, the method achieves interaction between game play and live content within a virtual live broadcast.

Description

Game live broadcast control method and device, computer storage medium and electronic equipment
Technical Field
The present disclosure relates to the field of internet technologies, and in particular, to a live game control method, a live game control apparatus, a computer storage medium, and an electronic device.
Background
With the development of live broadcast technology, many people watch an anchor's gameplay through live streams, and live interaction is an important touchpoint in the live broadcast process and an important medium carrying the relationship between the anchor and users. In current live interaction, a user can establish an interactive connection with the anchor by following the anchor, sending bullet-screen comments, and the like. However, the user's game scene and the live-viewing scene are split from each other, there is no interaction between game play and live content, and the in-live game interaction experience is weak.
It should be noted that the information disclosed in the background section above is only intended to enhance understanding of the background of the present disclosure, and therefore may include information that does not constitute prior art known to those of ordinary skill in the art.
Disclosure of Invention
The present disclosure is directed to a game live broadcast control method and apparatus, a computer storage medium, and an electronic device, so as to implement interaction between the play scene and the live-viewing scene, thereby improving interactivity and enjoyment during game live broadcasts.
Additional features and advantages of the disclosure will be set forth in the detailed description which follows, or in part will be obvious from the description, or may be learned by practice of the disclosure.
According to one aspect of the present disclosure, there is provided a game live broadcast control method, including: receiving an instruction for starting a current game playing method, constructing a virtual scene corresponding to a current game, and rendering the acquired anchor image in the virtual scene; game elements required by the current game are pre-generated in the virtual scene according to the current game playing method; receiving sensing data of a live audience terminal, and controlling the motion of game elements in the virtual scene according to the sensing data; and when the motion state parameters of the game elements reach preset conditions, triggering game events corresponding to the game elements.
In an exemplary embodiment of the present disclosure, pre-generating the game elements required by the current game in the virtual scene according to the current game play includes: generating, according to a preset element generation template in the current game play, a first type of game element controlled by preset play logic and a second type of game element that moves according to the sensing data corresponding to the live audience terminal.
In an exemplary embodiment of the present disclosure, the game elements of the first type controlled by the preset play logic have initial motion state parameters; the method further comprises the following steps: updating the position coordinates of the first type game elements in the virtual scene frame by frame in the process that the first type game elements move according to the initial motion state parameters; and transmitting the updated video stream to the live audience terminal.
In an exemplary embodiment of the present disclosure, the sensing data includes an inclination angle of the live audience terminal relative to a preset reference; controlling the motion of the second type game element in the virtual scene according to the received sensing data of the live audience terminal includes: converting the inclination angle into a moving distance of the second type game element in each direction in the virtual scene according to the current game play; and moving the spatial coordinates of the second type game element in the current video frame by the moving distance, and transmitting the updated video stream to the live audience terminal.
In an exemplary embodiment of the present disclosure, the sensing data includes a touch trajectory of a touch medium on the live audience terminal interface; controlling the motion of the second type game element in the virtual scene according to the received sensing data of the live audience terminal includes: converting the coordinates of each touch point in the touch trajectory into moving coordinates of the second type game element in the virtual scene according to the current game play; and updating the spatial coordinates of the second type game element in each video frame according to the moving coordinates, and transmitting the updated video stream to the live audience terminal.
In an exemplary embodiment of the present disclosure, the method further comprises: and acquiring the moving speed of a touch point in the touch track, and taking the moving speed as the moving speed of a second type of game elements in the virtual scene.
In an exemplary embodiment of the present disclosure, the sensing data includes a translation distance of the live audience terminal parallel to one of its sides; controlling the motion of the second type game element in the virtual scene according to the received sensing data of the live audience terminal includes: converting the translation distance into a moving distance of the second type game element along the horizontal direction in the virtual scene, or perpendicular to it, according to the current game play; and moving the spatial coordinates of the second type game element in the current video frame by the moving distance, and transmitting the updated video stream to the live audience terminal.
In an exemplary embodiment of the present disclosure, triggering the game event corresponding to the game element when the motion state parameters of the game element reach a preset condition includes: triggering the game event corresponding to the game elements when the spatial coordinates of a first type game element and a second type game element coincide in the virtual scene.
In an exemplary embodiment of the present disclosure, triggering the game event corresponding to the game elements when the spatial coordinates of the first type game element and the second type game element coincide in the virtual scene includes: when the spatial coordinates coincide, determining collision parameters according to at least one of the current motion speed and motion acceleration of each game element and the number of game elements whose spatial coordinates coincide; and triggering different types of game events according to the collision parameters, where the collision parameters include at least a collision angle and a collision force.
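As a rough illustration, collision parameters of the kind named above can be derived from the elements' relative velocity. The disclosure only names angle and force as collision parameters, so the formulas and thresholds below are assumptions for illustration:

```python
import math

def collision_params(vel_a, vel_b, mass=1.0):
    """When two elements' coordinates coincide, compute an assumed
    collision angle (degrees, from the relative velocity direction)
    and an impact 'force' proportional to the relative speed."""
    rvx = vel_a[0] - vel_b[0]
    rvy = vel_a[1] - vel_b[1]
    angle = math.degrees(math.atan2(rvy, rvx))
    force = mass * math.hypot(rvx, rvy)
    return angle, force

def classify_event(angle, force, n_coincident):
    """Map collision parameters to an event type; the event names and
    thresholds are hypothetical, not taken from the disclosure."""
    if n_coincident > 2:
        return "chain_event"
    return "strong_hit" if force > 5.0 else "soft_hit"
```

A play implementation would call `collision_params` at the moment of coincidence and dispatch on the returned event type.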
In an exemplary embodiment of the present disclosure, the triggering a game event corresponding to the game element when the motion state parameter of the game element reaches a preset condition includes: and when the position coordinates of the first type game elements and/or the second type game elements are coincident with the target limb position coordinates in the anchor patch in the virtual scene, triggering game events corresponding to the game elements.
In an exemplary embodiment of the present disclosure, the triggering a game event corresponding to the game element includes: determining the type of the game event according to the type of the target limb.
In an exemplary embodiment of the present disclosure, the game event corresponding to the game element includes changing at least one of a motion state parameter, a motion trajectory, a shape, a volume, a color, a special effect, and a game element type of the game element.
According to an aspect of the present disclosure, there is provided a live game control apparatus including: the virtual scene building module is used for receiving an instruction for starting the current game playing method, building a virtual scene corresponding to the current game, and rendering the acquired anchor image in the virtual scene; a game element generation module, configured to pre-generate game elements required by the current game in the virtual scene according to the current game play; the motion control module is used for receiving sensing data of a live audience terminal and controlling the motion of game elements in the virtual scene according to the sensing data; and the event triggering module is used for triggering the game event corresponding to the game element when the motion state parameter of the game element reaches a preset condition.
According to an aspect of the present disclosure, there is provided a computer storage medium having a computer program stored thereon, the computer program, when executed by a processor, implementing a live game control method as described in any one of the above.
According to an aspect of the present disclosure, there is provided an electronic device including: a processor; and a memory for storing executable instructions of the processor; wherein the processor is configured to perform any one of the above-described live game control methods via execution of the executable instructions.
In the game live broadcast control method of the exemplary embodiments of the present disclosure, after an instruction to start the current game play is received, a virtual scene corresponding to the current game is constructed and the captured anchor image is rendered into it; the game elements required by the current game are then generated in the virtual scene, the motion of those elements is controlled using the sensing data of the live audience terminals after the game starts, and a corresponding game event is triggered when a game element's motion state parameters reach a preset condition. A live broadcast room based on virtual broadcast can thus construct a virtual scene for the current game play, render the captured anchor image into it, and generate the required game elements within it, enriching the forms of game live broadcast. Controlling the movement of game elements in the virtual scene according to sensing data from the live audience terminals creates interaction between the play scene and the live scene and triggers game events; based on this control of game elements, interaction among the anchor terminal, the live audience terminals, and the live game content is achieved, the interactivity of game live broadcast is enhanced, and users obtain an immersive interactive experience.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the disclosure.
Drawings
The above and other objects, features and advantages of exemplary embodiments of the present disclosure will become readily apparent from the following detailed description read in conjunction with the accompanying drawings. Several embodiments of the present disclosure are illustrated by way of example, and not by way of limitation, in the figures of the accompanying drawings and in which:
fig. 1 shows a flowchart of a live game control method according to an exemplary embodiment of the present disclosure;
fig. 2 illustrates a flowchart for adjusting a screen display state of a virtual scene according to a current play method according to an exemplary embodiment of the present disclosure;
FIG. 3 illustrates an example diagram of translating coordinates of a touch point in a touch trajectory to coordinates of movement of a second game element within a virtual scene according to an example embodiment of the present disclosure;
fig. 4 shows a schematic structural diagram of a live game control apparatus according to an exemplary embodiment of the present disclosure;
FIG. 5 shows a schematic diagram of a storage medium according to an exemplary embodiment of the present disclosure; and
fig. 6 shows a block diagram of an electronic device according to an exemplary embodiment of the present disclosure.
In the drawings, the same or corresponding reference numerals indicate the same or corresponding parts.
Detailed Description
Exemplary embodiments will now be described more fully with reference to the accompanying drawings. The exemplary embodiments, however, may be embodied in many different forms and should not be construed as limited to the examples set forth herein; rather, these embodiments are provided so that this disclosure will be thorough and complete, and will fully convey the concept of exemplary embodiments to those skilled in the art. The same reference numerals in the drawings denote the same or similar structures, and thus their detailed description will be omitted.
Furthermore, the described features, structures, or characteristics may be combined in any suitable manner in one or more embodiments. In the following description, numerous specific details are provided to give a thorough understanding of embodiments of the disclosure. One skilled in the relevant art will recognize, however, that the subject matter of the present disclosure can be practiced without one or more of the specific details, or with other methods, components, devices, steps, and so forth. In other instances, well-known structures, methods, devices, implementations, or operations are not shown or described in detail to avoid obscuring aspects of the disclosure.
The block diagrams shown in the figures are functional entities only and do not necessarily correspond to physically separate entities. That is, these functional entities may be implemented in software, or in one or more hardware modules or integrated circuits, or in different networks and/or processor devices and/or microcontroller devices.
In existing game live broadcast technology, it is difficult for user gameplay and other UGC (User-Generated Content) to interact directly with the live video picture (live content); that is, the play scene and the live-viewing scene are split from each other. For example, a user's movement control of game elements cannot be reflected in the live interface, the interactivity among the anchor terminal, the live audience terminals, and the live game content is low, and it is difficult to truly achieve immersive interaction between game play and the live-viewing scene.
Based on this, in the exemplary embodiment of the present disclosure, a game live control method is first provided. Referring to fig. 1, the live game control method includes the following steps:
step S110: receiving an instruction for starting a current game playing method, constructing a virtual scene corresponding to a current game, and rendering the acquired anchor image in the virtual scene;
step S120: game elements required by the current game are pre-generated in the virtual scene according to the current game playing method;
step S130: receiving sensing data of a live audience terminal, and controlling the movement of game elements in the virtual scene according to the sensing data;
step S140: when the motion state parameters of a game element reach a preset condition, triggering the game event corresponding to the game element.
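The four steps above can be sketched as a per-frame control loop. All structures and names below are illustrative stand-ins (the disclosure does not define an API), with a first-type element driven by preset logic, a second-type element driven by audience sensing data, and an event fired when their coordinates coincide:

```python
def build_scene(play_name):
    """S110: an empty virtual scene plus a slot for the rendered
    anchor image (a dict is used as an illustrative stand-in)."""
    return {"play": play_name, "anchor": None, "elements": []}

def pre_generate_elements(scene):
    """S120: one logic-driven (first type) and one audience-driven
    (second type) element, each with initial motion state parameters."""
    scene["elements"] = [
        {"type": 1, "pos": [0.0, 10.0], "vel": [0.0, -1.0]},
        {"type": 2, "pos": [0.0, 0.0], "vel": [0.0, 0.0]},
    ]

def run_frame(scene, sensing_dx, on_event):
    """S130-S140: advance first-type elements by preset play logic,
    move the second-type element by the audience's sensing data, and
    trigger a game event when the elements' coordinates coincide."""
    for el in scene["elements"]:
        if el["type"] == 1:
            el["pos"][0] += el["vel"][0]
            el["pos"][1] += el["vel"][1]
        else:
            el["pos"][0] += sensing_dx
    first, second = scene["elements"]
    if (abs(first["pos"][0] - second["pos"][0]) < 0.5
            and abs(first["pos"][1] - second["pos"][1]) < 0.5):
        on_event(first, second)
```

In this sketch the falling first-type element meets the stationary second-type element after ten frames, at which point the event callback fires once.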
According to the game live broadcast control method in this exemplary embodiment, a live broadcast room based on virtual broadcast can construct a virtual scene corresponding to the current game play, render the captured anchor image into it, and generate the game elements required by the current game within it, enriching the forms of game live broadcast. Controlling the movement of game elements in the virtual scene according to sensing data from the live audience terminals creates interaction between the play scene and the live scene and triggers game events; based on this control of game elements, interaction among the anchor terminal, the live audience terminals, and the live game content is achieved, the interactivity of game live broadcast is enhanced, and users obtain an immersive interactive experience.
In an exemplary embodiment of the present disclosure, the anchor terminal first starts a virtual broadcast, which can be done through a virtual broadcast function provided by a dedicated live platform or set up officially in an offline scene. After the broadcast starts, the anchor is in the virtual broadcast state, and live audiences enter the live broadcast room corresponding to the anchor to watch. A virtual broadcast means that the anchor stands in front of a green screen while the background behind the anchor is a virtual scene rendered by a rendering engine, and this scene can be changed.
The method for controlling the live game in the exemplary embodiment of the present disclosure is further described below with reference to fig. 1.
In step S110, an instruction to start a current game play is received, a virtual scene corresponding to the current game is constructed, and the acquired anchor image is rendered in the virtual scene.
In an exemplary embodiment of the present disclosure, the content presented in the live interface includes a virtual scene rendered with an anchor character. After receiving an instruction to start a current game play, in an exemplary embodiment of the present disclosure, the construction of a virtual scene corresponding to a current game may be completed through the following steps:
First, an anchor video is captured, and each video frame is matted to extract the anchor image. Second, a virtual scene corresponding to the current game play is built by a game rendering engine (for example, Unreal Engine), and the anchor image is rendered into the virtual scene (for example, as a texture on a patch in the virtual scene), thereby obtaining the virtual scene corresponding to the current game play displayed in the live interface and completing the virtual broadcast. A virtual camera placed in the virtual scene shoots the scene, and the captured content, namely the virtual scene with the anchor image rendered into it, is what is displayed in the live interface.
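The matting step can be illustrated with a minimal green-screen (chroma-key) heuristic. The disclosure does not specify the matting algorithm, so the threshold rule and function names below are purely assumptions:

```python
import numpy as np

def chroma_key_mask(frame, green_thresh=60):
    """Return a boolean foreground mask for an RGB frame shot against
    a green screen: a pixel is treated as background when its green
    channel exceeds both red and blue by more than `green_thresh`.
    (Illustrative heuristic only.)"""
    r = frame[..., 0].astype(int)
    g = frame[..., 1].astype(int)
    b = frame[..., 2].astype(int)
    background = (g - np.maximum(r, b)) > green_thresh
    return ~background

def composite(frame, scene):
    """Paste the matted anchor pixels over the rendered virtual scene."""
    mask = chroma_key_mask(frame)
    out = scene.copy()
    out[mask] = frame[mask]
    return out
```

A production system would use a proper keyer with spill suppression and edge feathering; the sketch only shows where matting sits in the pipeline.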
Through this exemplary embodiment, the captured anchor image can be rendered into a virtual scene constructed for the current game play, fusing the anchor image with the play scene; displaying the result in the live interface of the live audience terminal improves the sense of immersion when watching the game live broadcast.
In some possible embodiments, the instruction to start the current game play may be, for example, a touch operation performed by the user of a live audience terminal through a touch medium, such as a single tap, double tap, triple tap, or sliding operation on the terminal interface. As another example, the instruction may be that the live audience terminal produces preset UGC content, for example sending a preset password through a bullet-screen comment. As yet another example, the instruction may be the user rotating or shaking the control terminal corresponding to the live audience terminal. Of course, the instruction to start the current game play may also be determined according to the actual game play, and the present disclosure includes but is not limited to the above forms of instruction.
In some possible implementations, after an instruction to start the current game play is received, a play trigger template in the current game play may display prompt information on the live interface so that the user corresponding to the live audience terminal can confirm again whether to enter the current game play; the virtual scene is constructed according to the current game play only after the user's confirmation response is received, which avoids mode switching caused by the user's accidental operation.
In some possible embodiments, the screen display state of the virtual scene may be adjusted, including but not limited to the display direction (landscape or portrait), the display size, the aspect ratio, and the like. Fig. 2 illustrates a flowchart of adjusting the screen display state of the virtual scene according to the current game play according to an exemplary embodiment of the present disclosure; as shown in Fig. 2, the process includes:
in step S210, according to the current game playing method, the position and the angle of the virtual camera in the virtual scene are adjusted to adjust the screen display state of the virtual scene.
In an exemplary embodiment of the present disclosure, the position and angle of the virtual camera in the virtual scene may be adjusted according to the current game play; for example, adjusting the position and angle of the virtual camera changes the shot picture from the original landscape orientation to a portrait orientation. The display scale and size of the virtual scene may also be adjusted for different game plays, so that the virtual scene is adjusted to the picture display state required by the current play, such as full screen, which makes it easier for the user of the live audience terminal to control the game.
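A landscape-to-portrait re-framing of this kind can be sketched as follows. The camera representation and the pull-back factor are assumptions for illustration; a real engine such as Unreal would use its own camera component:

```python
from dataclasses import dataclass

@dataclass
class VirtualCamera:
    position: tuple        # (x, y, z) in scene units
    yaw_pitch_roll: tuple  # orientation in degrees
    aspect: float          # width / height of the rendered picture

def to_portrait(cam: VirtualCamera, distance_scale=1.5) -> VirtualCamera:
    """Re-frame from landscape to portrait: invert the aspect ratio and
    pull the camera back along z so the now-narrower frame still covers
    the play area (the 1.5 scale factor is an assumption)."""
    x, y, z = cam.position
    return VirtualCamera((x, y, z * distance_scale),
                         cam.yaw_pitch_roll,
                         1.0 / cam.aspect)
```

After such an adjustment, the server would re-render and push the re-framed picture to the live audience terminals.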
In step S220, an adjustment completion instruction is sent to the live viewer terminal.
In an exemplary embodiment of the disclosure, the adjustment completion instruction instructs the live audience terminal to update the components hosted in the live interface according to the current game play.
Specifically, after the picture display state of the virtual scene is adjusted, an adjustment completion instruction is sent to the live audience terminal, instructing it to update the components hosted in the original live interface, including but not limited to hiding functional components or controls not needed in the current game play and adding target functional components or controls required by the current game play, thereby achieving fully immersive interaction through the live interface.
In step S120, game elements required for the current game are pre-generated within the virtual scene according to the current game play.
In an exemplary embodiment of the present disclosure, the game elements include first type game elements controlled by preset play logic, such as models with different initial motion speeds, or model props with special gain effects or state effects (e.g., bomb or magic props). First type game elements generally have initial motion state parameters in the game, including but not limited to initial position coordinates, motion speed, and the direction of the gravity vector. While a first type game element moves according to its initial motion state parameters, its position coordinates in the virtual scene are updated frame by frame, and the updated video stream is transmitted to the live audience terminal.
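The frame-by-frame update of a first type element can be sketched with simple Euler integration over the initial motion state parameters (position, velocity, gravity vector). The motion model is an assumption; the disclosure does not specify one:

```python
def step_element(pos, vel, gravity, dt):
    """Advance one frame: velocity accumulates gravity, position
    accumulates velocity. pos/vel/gravity are (x, y, z) tuples."""
    vel = tuple(v + g * dt for v, g in zip(vel, gravity))
    pos = tuple(p + v * dt for p, v in zip(pos, vel))
    return pos, vel

def advance_frames(element, dt, n):
    """Update an element dict frame by frame, collecting the per-frame
    positions that would be rendered into the outgoing video stream."""
    frames = []
    for _ in range(n):
        element["pos"], element["vel"] = step_element(
            element["pos"], element["vel"], element["gravity"], dt)
        frames.append(element["pos"])
    return frames
```

Each entry of `frames` corresponds to one rendered video frame pushed to the live audience terminal.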
In an exemplary embodiment of the present disclosure, the game elements further include second type game elements that move according to the sensing data corresponding to the live audience terminal, for example model props that can be manipulated in the game by the users of live audience terminals, such as mechanisms, virtual characters, and the like.
Through the exemplary embodiments of the present disclosure, the first and second type game elements required by the current game can be pre-generated in the virtual scene according to the current game play, converting the content of the game play into interactive game elements in the virtual live broadcast room. The game elements serve as display content in the virtual scene, and the anchor, the game elements, and the virtual scene together form the display content of the game live interface, realizing interaction between the game scene and the game live scene and improving the interactive experience of game live broadcast.
In step S130, sensing data of the live viewer terminal is received, and the motion of the game element in the virtual scene is controlled according to the sensing data.
In an exemplary embodiment of the present disclosure, after the current game is triggered, the sensing data of the live audience terminal may be acquired, and the motion of the second type game element in the virtual scene may be controlled according to the sensing data of the live audience terminal.
In some possible embodiments, the sensing data includes an inclination angle of the live audience terminal relative to a preset reference. The inclination angle includes but is not limited to a tilt angle, an azimuth angle, and a rotation angle, and a corresponding preset reference can be set in advance for each type of angle. The preset reference is determined from the initial position of the live audience terminal; for example, it can be the initial position of the terminal's central axis. The preset reference can of course also be determined in other ways, as long as it can serve as the reference for determining the inclination angle of the live audience terminal, which is not particularly limited in the present disclosure.
Furthermore, the inclination angle can be processed according to the current game play so as to convert it into a moving distance of the second type game element in each direction in the virtual scene; the spatial coordinates of the second type game element in the current video frame are then moved by that distance, and the updated video is transmitted back to the user terminal, so that the game element in the virtual scene can be operated from the user terminal. Optionally, a mapping between the inclination angle of the live audience terminal relative to the preset reference and the three-dimensional spatial coordinates of the second type game element in the virtual scene may be established, and the inclination angle converted into a moving distance in each direction according to that mapping. Optionally, a correspondence between inclination angles and movement distance values of game elements in the virtual scene may be established in advance and queried to perform the conversion. Of course, the present disclosure includes but is not limited to the above methods of converting the inclination angle into the moving distance of the second type game element in the virtual scene.
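One such angle-to-distance mapping can be sketched as a clamped linear gain. The gain and clamp constants are assumptions, not values from the disclosure:

```python
GAIN_X = 0.05   # scene units per degree of roll (assumed)
GAIN_Y = 0.05   # scene units per degree of pitch (assumed)
MAX_STEP = 2.0  # clamp on per-frame movement (assumed)

def tilt_to_distance(roll_deg, pitch_deg):
    """Map the terminal's tilt relative to its preset reference to a
    per-frame (dx, dy) displacement of a second type game element."""
    dx = max(-MAX_STEP, min(MAX_STEP, roll_deg * GAIN_X))
    dy = max(-MAX_STEP, min(MAX_STEP, pitch_deg * GAIN_Y))
    return dx, dy

def apply_tilt(element_pos, roll_deg, pitch_deg):
    """Move the element's spatial coordinates in the current frame."""
    dx, dy = tilt_to_distance(roll_deg, pitch_deg)
    x, y, z = element_pos
    return (x + dx, y + dy, z)
```

The clamp prevents a sharply tilted terminal from teleporting the element across the scene in a single frame.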
Taking a doll-catching game as an example, the following describes how the inclination angle of the live audience terminal is converted into the moving distance of the second type game element in each direction in the virtual scene.
First, the play rule of the doll-catching game: by moving the live audience terminal, the user moves a catching container located at the bottom of the live audience terminal's interface so as to catch as many falling dolls as possible per unit time, while avoiding props such as bombs that cause the game to fail, thereby obtaining a higher score. The first type game elements are the falling dolls and the props, such as bombs, that cause game failure; the second type game element is the catching container the user uses to catch the dolls.
Further, after the tilt angle of the live audience terminal relative to a preset reference is obtained, the tilt angle is processed to obtain a scalar component parallel to the controllable receiving container; this scalar is then mapped, according to the current game play, to a horizontal movement distance of the container within the virtual scene. After the mapping, the spatial coordinates of the receiving container in the current video frame are moved by this distance, and finally the updated video stream is transmitted to the live audience terminal, so that the live audience terminal controls the container within the virtual scene.
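The container-control pipeline just described (tilt angle, then scalar component parallel to the container, then horizontal movement of the container's frame coordinates) might be sketched as follows; the sine projection, gain and scene bounds are assumed for illustration:

```python
import math

def tilt_to_horizontal_move(tilt_deg: float, gain: float = 300.0) -> float:
    """Project the tilt angle onto the axis parallel to the receiving
    container (horizontal), yielding a signed scalar, then scale that
    scalar to a movement distance in scene units. `gain` is an assumed
    play-specific factor."""
    scalar = math.sin(math.radians(tilt_deg))  # component parallel to container
    return scalar * gain

def update_container_x(x: float, tilt_deg: float,
                       x_min: float = 0.0, x_max: float = 1080.0) -> float:
    """Move the container's x coordinate in the current video frame,
    clamped to the (assumed) scene bounds."""
    return max(x_min, min(x_max, x + tilt_to_horizontal_move(tilt_deg)))
```

Tilting right moves the container right until it reaches the scene edge; a zero tilt leaves it in place.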
In some possible embodiments, the sensing data may include a touch trajectory of a touch medium on the live audience terminal interface, including but not limited to the sliding trajectory of the user's finger on the interface, tap positions, and the like. After the touch trajectory is obtained, the coordinates of each point in the trajectory are converted, according to the current game play, into movement coordinates of the second-type game element in the virtual scene; the spatial coordinates of the second-type game element in each video frame are then updated according to those movement coordinates, and the updated video stream is transmitted to the live audience terminal.
Fig. 3 is a schematic diagram of converting the coordinates of touch points in a touch trajectory into movement coordinates of a second-type game element in a virtual scene according to an exemplary embodiment of the present disclosure. As shown in Fig. 3, the user's touch medium (e.g., a finger) moves from position A to position B on the live audience terminal interface, passing through touch points C and D. According to the current game play, the coordinates of touch points A and B are converted into the movement coordinates of the initial position a and the end position b of the second-type game element in the virtual scene, and the coordinates of touch points C and D are converted into movement coordinates c and d. Finally, the spatial coordinates of the second-type game element in each video frame are updated according to the movement coordinates a, b, c and d in touch order, and the updated video stream is transmitted to the live audience terminal, so that the second-type game element is displayed moving from position a to position b. The live audience terminal thereby controls the second-type game element in the virtual scene.
It should be noted that when the touch trajectory consists of tap positions of the user's finger on the live audience terminal interface, referring to the example shown in Fig. 3, after the user's touch medium is detected to move from position A to position B, the coordinates of touch points A and B are converted into the movement coordinates of the initial position a and the end position b of the second-type game element in the virtual scene. Without considering the movement path from a to b, the spatial coordinates of the second-type game element in the video frame are updated according to the movement coordinates a and b, and the updated video stream is transmitted to the live audience terminal, so that the element is likewise displayed moving from position a to position b. In addition, in the above example the touch medium passes through 2 intermediate touch points; other numbers of touch points may be obtained according to actual requirements, and the present disclosure places no special limitation on the number of touch points in the obtained touch trajectory.
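The coordinate conversion of Fig. 3 can be sketched as a simple proportional screen-to-scene mapping; the screen and scene dimensions below are assumptions for illustration:

```python
def map_touch_to_scene(points, screen_w, screen_h, scene_w, scene_h):
    """Convert touch points (pixel coordinates on the viewer terminal
    interface) into movement coordinates of the second-type game element
    in the virtual scene, using a proportional mapping (assumed)."""
    sx, sy = scene_w / screen_w, scene_h / screen_h
    return [(px * sx, py * sy) for px, py in points]

# Touch path A -> C -> D -> B on an assumed 1080x1920 screen, mapped
# into an assumed 540x960 scene; per-frame spatial coordinates are then
# updated in touch order.
path = [(100, 1800), (300, 1700), (600, 1750), (900, 1800)]  # A, C, D, B
scene_path = map_touch_to_scene(path, 1080, 1920, 540, 960)
```

Each element of `scene_path` becomes the element's target coordinate for one or more successive video frames.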
Through this exemplary embodiment, the touch trajectory of the touch medium on the live audience terminal can be converted into the movement trajectory of the second-type game element in the virtual scene, realizing interaction between the game scene and the live scene, improving interactivity among live audience users, the anchor and the game elements, and enhancing the live interactive experience.
In some possible embodiments, the movement speed of the touch point in the touch trajectory may also be obtained and used as the movement speed of the second-type game element in the virtual scene. On this basis, the user's touch operation on the live audience terminal interface closely matches the movement trajectory and timing of the second-type game element in the virtual scene, improving the game interaction experience.
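A sketch of deriving the touch-point movement speed from two consecutive trajectory samples, reused directly as the element's scene speed (the 1:1 speed mapping is an assumption):

```python
import math

def touch_speed(p0, p1, dt):
    """Speed of the touch point between two consecutive samples
    (pixels per second), used as the movement speed of the
    second-type game element in the virtual scene."""
    dist = math.hypot(p1[0] - p0[0], p1[1] - p0[1])
    return dist / dt
```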
In some possible embodiments, the sensing data includes a translation distance of the live audience terminal parallel to one of its sides; that is, the user may control the movement of the second-type game element in the virtual scene by translating the live audience terminal. According to the current game play, the translation distance is converted into a movement distance of the second-type game element along the horizontal direction of the virtual scene, or a movement distance perpendicular to the horizontal direction; the spatial coordinates of the second-type game element in the current video frame are moved by that distance, and the updated video stream is transmitted to the live audience terminal.
For example, when the user moves the live audience terminal a distance M parallel to its bottom side, i.e., along the horizontal direction, the translation distance M is converted into a distance m by which the second-type game element moves along the horizontal direction of the virtual scene. Likewise, when the user moves the terminal a distance N along a side perpendicular to the bottom side, i.e., perpendicular to the horizontal direction, the translation distance N is converted into a movement distance perpendicular to the horizontal direction of the virtual scene. On this basis, the movement of the second-type game element in the virtual scene can be controlled by translating the live audience terminal.
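A sketch of this translation-based control, where the pixel-to-scene ratio is an assumed play-specific constant:

```python
def parallel_move_to_scene(distance_px, axis, px_to_scene=0.5):
    """Convert the terminal's translation distance into a scene-space
    movement of the second-type game element. `axis` is 'horizontal'
    when the terminal moves parallel to its bottom edge, 'vertical'
    when it moves along the perpendicular side; `px_to_scene` is an
    assumed conversion ratio."""
    if axis not in ("horizontal", "vertical"):
        raise ValueError("axis must be 'horizontal' or 'vertical'")
    d = distance_px * px_to_scene
    return (d, 0.0) if axis == "horizontal" else (0.0, d)
```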
Further, in some possible embodiments, when the sensing data includes a translation distance of the live audience terminal parallel to one of its sides, the sensing data further includes a gravity value of the terminal. The gravity value may be converted, according to the current game play, into the acceleration with which the second-type game element moves perpendicular to the horizontal direction in the virtual scene. Taking the doll-catching game as an example, if the user moves the live audience terminal perpendicular to the horizontal direction, the receiving container moves perpendicular to the horizontal direction of the virtual scene, with its acceleration determined by the gravity value of the live audience terminal. On this basis, the motion state parameters of the second-type game element in the virtual scene can be adjusted according to different gravity values of live audience terminals, enriching the play and increasing the fun.
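The gravity-to-acceleration conversion might look as follows, assuming a simple linear scaling against a reference gravity reading (both the base acceleration and the linear form are assumptions):

```python
def gravity_to_acceleration(gravity_value, base_accel=200.0, g_ref=9.8):
    """Scale the vertical movement acceleration of the second-type game
    element (e.g., the receiving container) by the gravity reading of
    the viewer terminal. `base_accel` is in scene units per second
    squared; both constants are illustrative."""
    return base_accel * (gravity_value / g_ref)
```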
In step S140, when the motion state parameter of the game element reaches a preset condition, a game event corresponding to the game element is triggered.
In an exemplary embodiment of the present disclosure, the preset condition is determined according to different current game plays, and when the motion state parameter of the game element reaches the preset condition, a corresponding game event may be triggered.
In some possible embodiments, when the spatial coordinates of a first-type game element and a second-type game element coincide in the virtual scene, a game event corresponding to the game elements is triggered. That is, the first-type and second-type game elements "collide" in the virtual scene, and this case can be processed as follows:
First, when the spatial coordinates of the first-type and second-type game elements coincide, a collision parameter is determined based on the current game play, according to at least one of the current movement speed and movement acceleration of each game element and the number of game elements whose spatial coordinates coincide; the collision parameters include but are not limited to the collision angle and force.
Optionally, correspondences between different movement-speed intervals, movement-acceleration intervals and collision forces are preset in the current game play, so that after the data of each game element are obtained, the collision force is determined from these correspondences. For example, the greater the velocity at the moment of "collision", the greater the collision force; at the same velocity, the greater the acceleration, the greater the corresponding force. In addition, the collision force can be further adjusted according to the number of game elements whose spatial coordinates coincide, e.g., increased as the number of colliding elements increases.
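A sketch of the interval-based collision-force determination; the speed intervals, the acceleration weighting and the per-element multiplier are all illustrative assumptions:

```python
def collision_strength(speed, accel, n_overlapping,
                       speed_bins=((0, 5, 1.0),
                                   (5, 15, 2.0),
                                   (15, float("inf"), 3.0))):
    """Determine the collision force from preset speed intervals, then
    increase it with acceleration and with the number of elements whose
    spatial coordinates coincide. Interval boundaries and weights are
    assumed for illustration."""
    force = next(w for lo, hi, w in speed_bins if lo <= speed < hi)
    force *= 1.0 + 0.1 * accel   # greater acceleration, greater force
    force *= n_overlapping       # more colliding elements, greater force
    return force
```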
According to the above method, when the spatial coordinates of first-type and second-type game elements coincide, a collision parameter is determined from at least one of the current movement speed and movement acceleration of each game element and the number of coinciding elements, so that different game events can be triggered by different collision parameters when a game event occurs, enriching the types of triggered game events and the live game play.
Further, different types of game events are triggered according to the collision parameters; for example, explosions of different degrees can be produced for different collision forces and angles. On this basis, multiple types of game events can be triggered by different collision parameters, avoiding the monotony caused by the frequent recurrence of a single game event, increasing the variety of game interaction and improving the user experience.
In some possible embodiments, when the position coordinates of a first-type game element coincide with target limb position coordinates in the anchor image within the virtual scene, the game event corresponding to the game element is triggered.
The target limb position coordinates in the virtual scene are preset in the current game play. That is, after the current game is triggered, the target limb position coordinates of the game anchor in the virtual scene are detected in real time, and when a first-type game element coincides with them, the corresponding game event is triggered. For example, when a first-type game element is detected to coincide with a key point of the anchor's head in the virtual scene (i.e., the target limb position coordinates), the game event corresponding to that first-type game element is triggered.
Optionally, the type of the game event can be determined by the type of the target limb; that is, the same first-type game element can trigger different types of game events by coinciding with the position coordinates of different limb types. For example, for the same bomb element, coinciding with the head position coordinates triggers a special effect in which the bomb explodes with heart-shaped elements scattered around it, while coinciding with the shoulder position coordinates triggers an explosion surrounded by crying expressions. These are, of course, merely examples and may be set according to the specific game play, which the present disclosure does not specifically limit. On this basis, the live play can be enriched and the interactive game experience improved.
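The limb-type-to-event mapping from the head/shoulder example above can be sketched as a lookup table; the event names and the fallback behavior are hypothetical:

```python
# Hypothetical mapping from the target limb type hit by a bomb element
# to the triggered game event, following the head/shoulder example.
LIMB_EVENTS = {
    "head": "explode_with_heart_particles",
    "shoulder": "explode_with_crying_emotes",
}

def trigger_event(element_type, limb_type):
    """Return the game event triggered when a first-type element's
    position coincides with the given target limb's coordinates."""
    if element_type != "bomb":
        return "default_event"
    return LIMB_EVENTS.get(limb_type, "explode_default")
```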
It should be noted that the game events corresponding to the game elements include, but are not limited to, changing the motion state parameters (such as speed and acceleration), motion trajectory, shape, volume, color, special effects and element type of the game elements. For example, in the doll-catching game, when a first-type game element "collides" with the anchor's head, its motion trajectory changes, its movement speed increases and its volume shrinks; when a bomb is caught by the receiving container (coordinate coincidence), the bomb changes into an explosion effect in the virtual scene, the receiving container is replaced by a broken model, and so on. The present disclosure sets different triggered game events according to different game plays, without special limitation.
In an exemplary embodiment of the present disclosure, after the current game ends, a game result (e.g., a bonus pop-up window or the like) may be presented at the live audience terminal to prompt the user that the game interaction has ended.
In addition, in an exemplary embodiment of the present disclosure, after a game-ending instruction sent by the live audience terminal is received, the virtual scene is restored to the normal mode and all game elements are hidden; correspondingly, the live interface of the live audience terminal also returns to its normal state (similar to an ordinary live interface).
On this basis, the live room based on virtual broadcasting can construct a virtual scene corresponding to the current game play, render the acquired anchor image in the virtual scene, and generate the game elements required by the current game in the virtual scene, thereby enriching the forms of game live broadcast. The movement of game elements in the virtual scene is controlled according to the sensing data of the live audience terminals, producing interaction between the game scene and the live scene and triggering game events; based on this control of game elements, interaction among the anchor terminal, the live audience terminals and the live game content can be realized, enhancing live game interaction and giving users an immersive interactive experience.
In an exemplary embodiment of the present disclosure, a live game control apparatus is also provided. Referring to Fig. 4, the live game control apparatus 400 may include a virtual scene construction module 410, a game element generation module 420, a motion control module 430 and an event trigger module 440. Specifically:
a virtual scene constructing module 410, configured to receive an instruction to start a current game play, construct a virtual scene corresponding to a current game, and render the acquired anchor image in the virtual scene;
a game element generating module 420, configured to pre-generate game elements required by a current game in a virtual scene according to a current game playing method;
the motion control module 430 is configured to receive sensing data of a live viewer terminal, and control motion of game elements in the virtual scene according to the sensing data;
the event triggering module 440 is configured to trigger a game event corresponding to the game element when the motion state parameter of the game element reaches a preset condition.
Since each functional module of the live game control device in the exemplary embodiment of the present disclosure is the same as that in the embodiment of the live game control method, it is not described herein again.
It should be noted that although several modules or units of the live game control device are mentioned in the above detailed description, such division is not mandatory. Indeed, the features and functionality of two or more modules or units described above may be embodied in one module or unit, according to embodiments of the present disclosure. Conversely, the features and functions of one module or unit described above may be further divided into embodiments by a plurality of modules or units.
In addition, in the exemplary embodiments of the present disclosure, a computer storage medium capable of implementing the above method is also provided, on which a program product capable of implementing the above-described method of this specification is stored. In some possible embodiments, aspects of the present disclosure may also be implemented in the form of a program product comprising program code which, when the program product is run on a terminal device, causes the terminal device to perform the steps according to the various exemplary embodiments of the present disclosure described in the "exemplary methods" section above of this specification.
Referring to fig. 5, a program product 500 for implementing the above method according to an exemplary embodiment of the present disclosure is described, which may employ a portable compact disc read only memory (CD-ROM) and include program code, and may be run on a terminal device, such as a personal computer. However, the program product of the present disclosure is not limited thereto, and in this document, a readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device.
The program product may employ any combination of one or more readable media. The readable medium may be a readable signal medium or a readable storage medium. A readable storage medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any combination of the foregoing. More specific examples (a non-exhaustive list) of the readable storage medium include: an electrical connection having one or more wires, a portable disk, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing.
A computer readable signal medium may include a propagated data signal with readable program code embodied therein, for example, in baseband or as part of a carrier wave. Such a propagated data signal may take many forms, including, but not limited to, electro-magnetic, optical, or any suitable combination thereof. A readable signal medium may also be any readable medium that is not a readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device.
Program code embodied on a readable medium may be transmitted using any appropriate medium, including but not limited to wireless, wireline, optical fiber cable, RF, etc., or any suitable combination of the foregoing.
Program code for carrying out operations of the present disclosure may be written in any combination of one or more programming languages, including object-oriented programming languages such as Java or C++ as well as conventional procedural programming languages such as the "C" language or similar languages. The program code may execute entirely on the user's computing device, partly on the user's device, as a stand-alone software package, partly on the user's computing device and partly on a remote computing device, or entirely on the remote computing device or server. In the case of a remote computing device, the remote computing device may be connected to the user's computing device through any kind of network, including a local area network (LAN) or a wide area network (WAN), or may be connected to an external computing device (e.g., through the Internet using an Internet service provider).
In addition, in an exemplary embodiment of the present disclosure, an electronic device capable of implementing the above method is also provided. As will be appreciated by one skilled in the art, aspects of the present disclosure may be embodied as a system, method or program product. Accordingly, various aspects of the present disclosure may be embodied in the form of: an entirely hardware embodiment, an entirely software embodiment (including firmware, microcode, etc.) or an embodiment combining hardware and software aspects that may all generally be referred to herein as a "circuit," module "or" system.
An electronic device 600 according to such an embodiment of the present disclosure is described below with reference to fig. 6. The electronic device 600 shown in fig. 6 is only an example and should not bring any limitations to the function and scope of use of the embodiments of the present disclosure.
As shown in fig. 6, the electronic device 600 is embodied in the form of a general purpose computing device. The components of the electronic device 600 may include, but are not limited to: the at least one processing unit 610, the at least one memory unit 620, a bus 630 connecting different system components (including the memory unit 620 and the processing unit 610), and a display unit 640.
Wherein the storage unit stores program code that is executable by the processing unit 610 to cause the processing unit 610 to perform steps according to various exemplary embodiments of the present disclosure as described in the above section "exemplary methods" of this specification.
The storage unit 620 may include readable media in the form of volatile memory units, such as a random access memory unit (RAM) 621 and/or a cache memory unit 622, and may further include a read-only memory unit (ROM) 623.
The storage unit 620 may also include a program/utility 624 having a set (at least one) of program modules 625, such program modules 625 including, but not limited to: an operating system, one or more application programs, other program modules, and program data, each of which, or some combination thereof, may comprise an implementation of a network environment.
Bus 630 may be one or more of several types of bus structures, including a memory unit bus or memory unit controller, a peripheral bus, an accelerated graphics port, a processing unit, or a local bus using any of a variety of bus architectures.
The electronic device 600 may also communicate with one or more external devices 700 (e.g., keyboard, pointing device, bluetooth device, etc.), with one or more devices that enable a user to interact with the electronic device 600, and/or with any devices (e.g., router, modem, etc.) that enable the electronic device 600 to communicate with one or more other computing devices. Such communication may occur via an input/output (I/O) interface 650. Also, the electronic device 600 may communicate with one or more networks (e.g., a Local Area Network (LAN), a Wide Area Network (WAN), and/or a public network such as the Internet) via the network adapter 660. As shown, the network adapter 660 communicates with the other modules of the electronic device 600 over the bus 630. It should be appreciated that although not shown in the figures, other hardware and/or software modules may be used in conjunction with the electronic device 600, including but not limited to: microcode, device drivers, redundant processing units, external disk drive arrays, RAID systems, tape drives, and data backup storage systems, among others.
Through the above description of the embodiments, those skilled in the art will readily understand that the exemplary embodiments described herein may be implemented by software, or by software in combination with necessary hardware. Therefore, the technical solution according to the embodiments of the present disclosure may be embodied in the form of a software product, which may be stored in a non-volatile storage medium (which may be a CD-ROM, a usb disk, a removable hard disk, etc.) or on a network, and includes several instructions to enable a computing device (which may be a personal computer, a server, a terminal device, or a network device, etc.) to execute the method according to the embodiments of the present disclosure.
Furthermore, the above-described figures are merely schematic illustrations of processes included in methods according to exemplary embodiments of the present disclosure, and are not intended to be limiting. It will be readily understood that the processes shown in the above figures are not intended to indicate or limit the chronological order of the processes. In addition, it is also readily understood that these processes may be performed synchronously or asynchronously, e.g., in multiple modules.
Other embodiments of the disclosure will be apparent to those skilled in the art from consideration of the specification and practice of the disclosure disclosed herein. This disclosure is intended to cover any variations, uses, or adaptations of the disclosure following, in general, the principles of the disclosure and including such departures from the present disclosure as come within known or customary practice within the art to which the disclosure pertains. It is intended that the specification and examples be considered as exemplary only, with a true scope and spirit of the disclosure being indicated by the following claims.
It will be understood that the present disclosure is not limited to the precise arrangements described above and shown in the drawings and that various modifications and changes may be made without departing from the scope thereof. The scope of the present disclosure is to be limited only by the terms of the appended claims.

Claims (15)

1. A game live broadcast control method is characterized by comprising the following steps:
receiving an instruction for starting a current game playing method, constructing a virtual scene corresponding to a current game, and rendering the acquired anchor image in the virtual scene;
game elements required by the current game are pre-generated in the virtual scene according to the current game playing method;
receiving sensing data of a live audience terminal, and controlling the motion of game elements in the virtual scene according to the sensing data;
and when the motion state parameters of the game elements reach preset conditions, triggering game events corresponding to the game elements.
2. The method of claim 1, wherein pre-generating game elements required for the current game within the virtual scene according to the current game play comprises:
and generating a template according to preset elements in the current game playing method, and generating a first type game element controlled by a preset playing method logic and a second type game element moving according to sensing data corresponding to the live broadcast audience terminal.
3. The method of claim 2, wherein the first type of game elements controlled by the preset play logic have initial motion state parameters; the method further comprises the following steps:
updating the position coordinates of the first type game elements in the virtual scene frame by frame in the process that the first type game elements move according to the initial motion state parameters;
and transmitting the updated video stream to the live audience terminal.
4. The method of claim 2, wherein the sensory data includes a tilt angle of the live viewer terminal relative to a preset reference;
controlling the motion of a second type of game elements in the virtual scene according to the received sensing data of the live audience terminal, wherein the motion control method comprises the following steps:
converting the inclination angle into the moving distance of a second type game element in the virtual scene in each direction according to the current game playing method;
and moving the space coordinates of the second game elements in the current video frame according to the moving distance, and transmitting the updated video stream to the live audience terminal.
5. The method of claim 2, wherein the sensory data comprises a touch trajectory of a touch medium on a live viewer terminal interface;
controlling the motion of a second type of game elements in the virtual scene according to the received sensing data of the live audience terminal, wherein the motion control method comprises the following steps:
converting the coordinates of each touch point in the touch track into the moving coordinates of a second type of game elements in the virtual scene according to the current game playing method;
and updating the space coordinates of the second game elements in each video frame according to the moving coordinates, and transmitting the updated video stream to the live audience terminal.
6. The method of claim 5, further comprising:
and acquiring the moving speed of a touch point in the touch track, and taking the moving speed as the moving speed of a second type of game elements in the virtual scene.
7. The method of claim 2, wherein the sensory data includes a parallel travel distance of the live viewer terminal parallel to a side of the live viewer terminal;
the controlling the motion of the second type of game elements in the virtual scene according to the received sensing data of the live audience terminal comprises the following steps:
converting the parallel movement distance into a movement distance of a second type of game element motion in the virtual scene along a horizontal direction or a movement distance perpendicular to the horizontal direction in the virtual scene according to the current game playing method;
and moving the space coordinates of the second game elements in the current video frame according to the moving distance, and transmitting the updated video stream to the live audience terminal.
8. The method according to claim 2, wherein the triggering a game event corresponding to the game element when the motion state parameter of the game element reaches a preset condition includes:
when the spatial coordinates of the first game element and the second game element in the virtual scene are coincident, triggering a game event corresponding to the game element.
9. The method of claim 8, wherein triggering a game event corresponding to the game element when the first type game element and the second type game element coincide with each other in the spatial coordinates of the virtual scene comprises:
determining collision parameters according to at least one of the current motion speed and the motion acceleration of each game element and the number of the game elements with overlapped space coordinates when the first game element and the second game element are overlapped in space coordinates;
triggering different types of game events according to the collision parameters;
wherein, the collision parameters at least comprise collision angle and force.
10. The method according to claim 2, wherein the triggering a game event corresponding to the game element when the motion state parameter of the game element reaches a preset condition comprises:
and triggering the game event corresponding to the game element when the position coordinates of the first type game elements and/or the second type game elements coincide with the target limb position coordinates of the anchor image rendered in the virtual scene.
11. The method of claim 10, wherein triggering the game event corresponding to the game element comprises:
determining the type of the game event according to the type of the target limb.
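Claims 10 and 11 amount to a region test plus a lookup: if an element lands inside a limb region of the anchor, the limb type selects the event. The region representation and the limb-to-event mapping below are illustrative assumptions.

```python
# Hypothetical sketch of claims 10-11: when an element's position
# coincides with a target limb of the anchor, the limb type determines
# the event type. Mapping and region format are assumptions.

LIMB_EVENTS = {
    "head": "headbutt_bonus",
    "hand": "catch",
    "foot": "kick",
}

def limb_event(element_pos, limb_regions):
    """Return the event for the first limb region containing the
    element, or None if no limb is hit.

    limb_regions: {limb_type: (x0, y0, x1, y1)} bounding boxes for the
    anchor's limbs in scene coordinates (an assumed representation).
    """
    x, y = element_pos
    for limb, (x0, y0, x1, y1) in limb_regions.items():
        if x0 <= x <= x1 and y0 <= y <= y1:
            return LIMB_EVENTS.get(limb, "generic_hit")
    return None
```

In practice the limb regions would come from pose estimation on the captured anchor image, which the claims leave unspecified.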
12. The method of any one of claims 1 to 11, wherein the game event corresponding to the game element comprises changing at least one of a motion state parameter, a motion trajectory, a shape, a volume, a color, a special effect, and an element type of the game element.
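Claim 12 describes events as attribute changes on the element. A minimal sketch, with attribute names that are assumptions rather than anything fixed by the claims:

```python
from dataclasses import dataclass

# Hypothetical sketch of claim 12: a game event changes one or more of
# the element's attributes (motion state, trajectory, shape, volume,
# color, special effect, or element type). Names are illustrative.

@dataclass
class GameElement:
    element_type: str = "ball"
    color: str = "white"
    volume: float = 1.0
    speed: float = 0.0
    effect: str = ""

def apply_event(element, changes):
    """Apply an event's attribute changes to a game element in place
    and return it. `changes` maps attribute names to new values."""
    for attr, value in changes.items():
        setattr(element, attr, value)
    return element
```

For example, a collision event could recolor the element, speed it up, and attach a particle effect in one call.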
13. A game live broadcast control apparatus, the apparatus comprising:
a virtual scene construction module, configured to receive an instruction for starting the current game play, construct a virtual scene corresponding to the current game, and render the acquired anchor image in the virtual scene;
a game element generation module, configured to pre-generate game elements required by the current game in the virtual scene according to the current game play;
a motion control module, configured to receive sensing data of a live audience terminal and control the motion of the game elements in the virtual scene according to the sensing data;
and an event triggering module, configured to trigger the game event corresponding to a game element when the motion state parameter of the game element reaches a preset condition.
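The four modules of the apparatus claim can be pictured as one small pipeline. This is purely an illustrative sketch; the class, method names, and data shapes are all assumptions, not the patented apparatus.

```python
# Hypothetical sketch of claim 13: the four claimed modules as methods
# of a minimal controller. Names and structures are illustrative.

class LiveGameController:
    def __init__(self):
        self.scene = None
        self.elements = []

    def build_scene(self, start_instruction, anchor_image):
        """Virtual scene construction module: build the scene for the
        current game and render the anchor image into it."""
        self.scene = {"mode": start_instruction, "anchor": anchor_image}

    def generate_elements(self, count):
        """Game element generation module: pre-generate the elements
        required by the current game play."""
        self.elements = [{"id": i, "speed": 0.0} for i in range(count)]

    def apply_sensing(self, element_id, delta):
        """Motion control module: update an element's motion state
        from a viewer terminal's sensing data."""
        self.elements[element_id]["speed"] += delta

    def check_events(self, threshold):
        """Event triggering module: return ids of elements whose
        motion state parameter has reached the preset condition."""
        return [e["id"] for e in self.elements
                if e["speed"] >= threshold]
```

A real implementation would additionally re-render and re-stream each video frame, which this sketch omits.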
14. A computer storage medium having stored thereon a computer program which, when executed by a processor, implements the game live broadcast control method according to any one of claims 1 to 12.
15. An electronic device, comprising:
a processor; and
a memory for storing executable instructions of the processor;
wherein the processor is configured to perform the game live broadcast control method of any one of claims 1 to 12 via execution of the executable instructions.
CN202111630501.4A 2021-12-28 2021-12-28 Game live broadcast control method and device, computer storage medium and electronic equipment Pending CN114307154A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202111630501.4A CN114307154A (en) 2021-12-28 2021-12-28 Game live broadcast control method and device, computer storage medium and electronic equipment

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202111630501.4A CN114307154A (en) 2021-12-28 2021-12-28 Game live broadcast control method and device, computer storage medium and electronic equipment

Publications (1)

Publication Number Publication Date
CN114307154A true CN114307154A (en) 2022-04-12

Family

ID=81015676

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202111630501.4A Pending CN114307154A (en) 2021-12-28 2021-12-28 Game live broadcast control method and device, computer storage medium and electronic equipment

Country Status (1)

Country Link
CN (1) CN114307154A (en)

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115150634A (en) * 2022-07-06 2022-10-04 广州博冠信息科技有限公司 Live broadcast room information processing method and device, storage medium and electronic equipment
CN115396685A (en) * 2022-08-23 2022-11-25 广州博冠信息科技有限公司 Live broadcast interaction method and device, readable storage medium and electronic equipment
CN115396685B (en) * 2022-08-23 2024-03-15 广州博冠信息科技有限公司 Live interaction method and device, readable storage medium and electronic equipment

Similar Documents

Publication Publication Date Title
CN107680157B (en) Live broadcast-based interaction method, live broadcast system and electronic equipment
US11514653B1 (en) Streaming mixed-reality environments between multiple devices
WO2019153840A1 (en) Sound reproduction method and device, storage medium and electronic device
US20180290061A1 (en) Interactive control management for a live interactive video game stream
CN110465097B (en) Character vertical drawing display method and device in game, electronic equipment and storage medium
CN114307154A (en) Game live broadcast control method and device, computer storage medium and electronic equipment
US20230015409A1 (en) Information prompt method and apparatus in virtual scene, electronic device, and storage medium
CN111888759A (en) Game skill release method, data processing method and device
CN111744202A (en) Method and device for loading virtual game, storage medium and electronic device
CN112891943B (en) Lens processing method and device and readable storage medium
TWI831074B (en) Information processing methods, devices, equipments, computer-readable storage mediums, and computer program products in virtual scene
CN103760972A (en) Cross-platform augmented reality experience
KR20210105267A (en) Systems and methods for transcribing user interface elements of a game application into haptic feedback
CN111744180A (en) Method and device for loading virtual game, storage medium and electronic device
CN114466213A (en) Information synchronization method, device, computer equipment, storage medium and program product
KR20170013539A (en) Augmented reality based game system and method
WO2023029626A1 (en) Avatar interaction method and apparatus, and storage medium and electronic device
WO2023065949A1 (en) Object control method and apparatus in virtual scene, terminal device, computer-readable storage medium, and computer program product
US11623150B2 (en) Rendering method for drone game
CN115068929A (en) Game information acquisition method and device, electronic equipment and storage medium
WO2020248682A1 (en) Display device and virtual scene generation method
JP6679054B1 (en) Game device, game system, program, and game control method
CN113041616A (en) Method and device for controlling jumping display in game, electronic equipment and storage medium
WO2023030106A1 (en) Object display method and apparatus, electronic device, and storage medium
EP4306192A1 (en) Information processing device, information processing terminal, information processing method, and program

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination