CN111586465B - Operation interaction method, device and equipment for live broadcast room and storage medium

Operation interaction method, device and equipment for live broadcast room and storage medium

Info

Publication number
CN111586465B
CN111586465B CN202010368000.2A CN202010368000A CN111586465B CN 111586465 B CN111586465 B CN 111586465B CN 202010368000 A CN202010368000 A CN 202010368000A CN 111586465 B CN111586465 B CN 111586465B
Authority
CN
China
Prior art keywords
live broadcast
broadcast room
dimensional
display
display area
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202010368000.2A
Other languages
Chinese (zh)
Other versions
CN111586465A (en)
Inventor
许英俊
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Guangzhou Cubesili Information Technology Co Ltd
Original Assignee
Guangzhou Cubesili Information Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Guangzhou Cubesili Information Technology Co Ltd filed Critical Guangzhou Cubesili Information Technology Co Ltd
Priority to CN202010368000.2A priority Critical patent/CN111586465B/en
Publication of CN111586465A publication Critical patent/CN111586465A/en
Application granted granted Critical
Publication of CN111586465B publication Critical patent/CN111586465B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/431Generation of visual interfaces for content selection or interaction; Content or additional data rendering
    • H04N21/4312Generation of visual interfaces for content selection or interaction; Content or additional data rendering involving specific graphical features, e.g. screen layout, special fonts or colors, blinking icons, highlights or animations
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/20Servers specifically adapted for the distribution of content, e.g. VOD servers; Operations thereof
    • H04N21/21Server components or server architectures
    • H04N21/218Source of audio or video content, e.g. local disk arrays
    • H04N21/2187Live feed
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/443OS processes, e.g. booting an STB, implementing a Java virtual machine in an STB or power management in an STB
    • H04N21/4438Window management, e.g. event handling following interaction with the user interface
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/47End-user applications
    • H04N21/472End-user interface for requesting content, additional data or services; End-user interface for interacting with content, e.g. for content reservation or setting reminders, for requesting event notification, for manipulating displayed content
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/80Generation or processing of content or additional data by content creator independently of the distribution process; Content per se
    • H04N21/81Monomedia components thereof
    • H04N21/8146Monomedia components thereof involving graphical data, e.g. 3D object, 2D graphics

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Human Computer Interaction (AREA)
  • Databases & Information Systems (AREA)
  • Software Systems (AREA)
  • Computer Graphics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The application provides an operation interaction method, apparatus, device and storage medium for a live broadcast room, and relates to the field of live broadcasting. A three-dimensional live broadcast room model in a three-dimensional live broadcast environment is generated on a display interface, and corresponding function controls are arranged in different display areas of the three-dimensional live broadcast room model; a deflection angle of the three-dimensional live broadcast room model is obtained, a display area of the three-dimensional live broadcast room is determined according to the deflection angle and rendered on the display interface; and an interactive operation input by a user is received, the function control of the display area called by the interactive operation is determined, and the function corresponding to the function control is executed. This technical scheme enriches interactive operation in the three-dimensional live broadcast room environment, improves the interaction efficiency of the live broadcast room, and improves user experience.

Description

Operation interaction method, device and equipment for live broadcast room and storage medium
Technical Field
The application relates to the field of live broadcasting, in particular to an operation interaction method, device, equipment and storage medium for a live broadcasting room.
Background
In the current webcast mode, one or more anchors broadcast live while audience users watch, and the audience users can communicate and interact with the anchors through voice, video, text and pictures. When an audience user logs in to a live broadcast application and selects an anchor to watch from the live broadcast room list on the home page, the user enters that live broadcast room to watch the live video.
Due to the limited size of the display screen of the terminal device, in related live broadcast technologies the display interface of the terminal device can only support limited interactive operations, such as entering, exiting or switching a live broadcast room by clicking. However, as the functions of the live broadcast room grow richer, a single mode of interactive operation makes operations too cumbersome, the interaction efficiency is low, and user requirements cannot be met.
Disclosure of Invention
The object of the present application is to solve at least one of the above technical drawbacks, in particular the problem of low interaction efficiency.
In a first aspect, an embodiment of the present application provides an operation interaction method for a live broadcast room, including the following steps:
starting a three-dimensional live broadcast room mode, downloading three-dimensional live broadcast room resources from a server and generating a three-dimensional live broadcast room model on a display interface, wherein different display areas on the three-dimensional live broadcast room model are provided with corresponding functional controls;
obtaining a deflection angle of the three-dimensional live broadcast room model, determining a display area of the three-dimensional live broadcast room according to the deflection angle, and rendering on the display interface;
and receiving interactive operation input by a user, determining the function control of the display area called by the interactive operation, and executing the function corresponding to the function control.
In an embodiment, the step of obtaining a deflection angle of the three-dimensional live broadcast room model, determining a display area of the three-dimensional live broadcast room according to the deflection angle, and rendering on the display interface includes:
obtaining a rotation angle of a gyroscope inside a mobile terminal, and determining the deflection angle of the three-dimensional live broadcast room model according to the rotation angle;
determining a corresponding display area on the three-dimensional live broadcast room model according to the deflection angle based on the corresponding relation between the deflection angle and the space position of each display area of the three-dimensional live broadcast room model;
and rendering the live broadcast room picture corresponding to the display area on a display interface.
In an embodiment, the operation interaction method of the live broadcast room further includes: detecting a function control corresponding to the display area, and suspending a control icon corresponding to the function control on a display interface;
the step of receiving the interactive operation input by the user and determining the function control of the display area called by the interactive operation comprises the following steps:
and receiving interactive operation input on the control icon by a user acting on the display interface, and determining that the interactive operation calls the functional control corresponding to the control icon on the display area.
In an embodiment, the step of receiving an interactive operation input by a user, and determining a functional control of the presentation area invoked by the interactive operation includes:
receiving interactive operation input by a user on the display interface;
and acquiring the state parameters of the interactive operation acting on the display area, and calling the corresponding function control according to the state parameters.
In one embodiment, the interactive operation is a click interactive operation;
the step of obtaining the state parameter of the interactive operation acting on the display area and calling the corresponding function control according to the state parameter comprises the following steps:
acquiring a two-dimensional position parameter of the click interaction operation acting on the display interface;
converting the two-dimensional position parameter into a three-dimensional position parameter on a three-dimensional live broadcast room model corresponding to the display area based on the position corresponding relation between a display interface and the display area;
and calling a function control acting on a display area corresponding to the three-dimensional live broadcast room model according to the three-dimensional position parameter.
In one embodiment, the interaction is a sliding interaction;
the step of obtaining the state parameter of the interactive operation acting on the display area and calling the corresponding function control according to the state parameter comprises the following steps:
receiving sliding interactive operation input by a user on the display interface;
determining a sliding state parameter of the sliding interaction operation acting on a display area according to the sliding interaction operation, wherein the sliding state parameter comprises: at least one of a slide position coordinate, a slide direction, and a slide speed;
and calling the corresponding function control according to the sliding state parameter.
In an embodiment, the step of calling the corresponding function control according to the sliding state parameter includes:
calling a corresponding zooming control according to the sliding position coordinate and the sliding direction so as to zoom the display area; or,
and calling a corresponding switching control according to the sliding position coordinate, the sliding direction and the sliding speed so as to switch the live videos of the multiple live rooms or exit the three-dimensional live room model.
In one embodiment, the interaction is a rotational interaction;
the step of receiving the interactive operation input by the user and determining the functional control of the display area called by the interactive operation comprises the following steps:
receiving a rotation interactive operation input by a user through a rotating mobile terminal, and determining a deflection parameter of an internal gyroscope according to the rotation interactive operation;
determining, based on the deflection parameter, a display surface of the three-dimensional live broadcast room model corresponding to the deflection parameter, and switching to play a live video on that display surface; wherein the display surfaces of the three-dimensional live broadcast room model are rendered to obtain live videos of at least two live broadcast rooms.
In a second aspect, an embodiment of the present application further provides an operation interaction device for a live broadcast room, including:
the rendering module is used for starting a three-dimensional live broadcast room mode, downloading three-dimensional live broadcast room resources from a server and generating a three-dimensional live broadcast room model on a display interface, wherein different display areas on the three-dimensional live broadcast room model are provided with corresponding functional controls;
the display module is used for obtaining a deflection angle of the three-dimensional live broadcast room model, determining a display area of the three-dimensional live broadcast room according to the deflection angle and rendering on the display interface;
and the calling module is used for receiving the interactive operation input by the user, determining the function control of the display area called by the interactive operation and executing the function corresponding to the function control.
In a third aspect, an embodiment of the present application further provides a computer device, including a memory, a processor, and a computer program stored on the memory and executable on the processor, where the processor, when executing the program, implements the steps of the operation interaction method in the live broadcast room as mentioned in any embodiment of the first aspect.
In a fourth aspect, the present application further provides a storage medium containing computer-executable instructions which, when executed by a computer processor, are configured to perform the steps of the operation interaction method for a live broadcast room as mentioned in any embodiment of the first aspect.
According to the operation interaction method, the operation interaction device, the operation interaction equipment and the storage medium of the live broadcast room, the three-dimensional live broadcast room model in the three-dimensional live broadcast environment is generated on the display interface, and corresponding function controls are arranged in different display areas on the three-dimensional live broadcast room model; obtaining a deflection angle of a three-dimensional live broadcast room model, determining a display area of the three-dimensional live broadcast room according to the deflection angle, and rendering on a display interface; and receiving interactive operation input by a user, and determining a function control of a display area called by the interactive operation so as to execute a function corresponding to the function control, so that the interactive operation in the environment of the three-dimensional live broadcast room is enriched, the interactive efficiency of the live broadcast room is improved, and the user experience is improved.
Additional aspects and advantages of the present application will be set forth in part in the description which follows and, in part, will be obvious from the description, or may be learned by practice of the present application.
Drawings
The above and/or additional aspects and advantages of the present application will become apparent and readily appreciated from the following description of the embodiments, taken in conjunction with the accompanying drawings of which:
fig. 1 is a schematic frame diagram of a webcast system according to an embodiment;
FIG. 2 is a flow diagram of an operational interaction method for a live broadcast room, provided by an embodiment;
FIG. 3 is a diagram of a three-dimensional live room model rendered by a display interface according to an embodiment;
FIG. 4 is another diagram of a three-dimensional live room model rendered by a display interface according to an embodiment;
fig. 5 is a schematic structural diagram of an operation interaction device of a live broadcast room according to an embodiment.
Detailed Description
Reference will now be made in detail to the embodiments of the present application, examples of which are illustrated in the accompanying drawings, wherein like reference numerals refer to the same or similar elements or elements having the same or similar functions throughout. The embodiments described below with reference to the accompanying drawings are exemplary only for explaining the present application and are not construed as limiting the present application.
As used herein, the singular forms "a", "an" and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise. It will be further understood that the terms "comprises" and/or "comprising," when used in this specification, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof. As used herein, the term "and/or" includes any and all combinations of one or more of the associated listed items.
It will be understood by those within the art that, unless otherwise defined, all terms (including technical and scientific terms) used herein have the same meaning as commonly understood by one of ordinary skill in the art to which this application belongs. It will be further understood that terms, such as those defined in commonly used dictionaries, should be interpreted as having a meaning that is consistent with their meaning in the context of the prior art and will not be interpreted in an idealized or overly formal sense unless expressly so defined herein.
Before describing the method provided by the embodiments of the present application, an application scenario of the embodiments is described first. Please refer to fig. 1, which is a schematic diagram of the framework of a webcast system provided by an embodiment. The system framework may include a server side and clients, where the clients include one or more anchor clients 10 (i.e., the anchor side, hereinafter the same) and multiple viewer clients 20 (i.e., the viewer side, hereinafter the same). The live platform on the server side may include multiple virtual live broadcast rooms and a server 30, and each anchor client 10 and each viewer client 20 establishes a communication connection with the server 30 through a wired or wireless network.
Generally speaking, each virtual live broadcast room correspondingly plays different live broadcast contents, the anchor broadcasts are live broadcast through the anchor client 10, and audiences select to enter a certain virtual live broadcast room through the audience client 20 to watch the anchor broadcast for live broadcast. The viewer client 20 and the anchor client 10 may enter the live platform through a live Application (APP) installed on the terminal device.
The anchor client 10 and the audience client 20 are terminal devices, such as a smart phone, a tablet computer, an e-reader, a desktop computer, or a notebook computer, and the like, which is not limited thereto. The server 30 is a background server for providing background services for the terminal device, and may be implemented by an independent server or a server cluster composed of a plurality of servers. In one embodiment, the server 30 may be a live web platform.
The operation interaction method for a live broadcast room provided by the embodiments of the present application is applicable to a three-dimensional live broadcast room model: corresponding interaction modes can be configured for different display areas of the three-dimensional panoramic live broadcast room, and the related function controls can be called quickly and simply to execute the corresponding functions, thereby enriching the interaction modes in the three-dimensional live broadcast environment and improving the interaction efficiency.
The following describes in detail a technical solution of an operation interaction method of a live broadcast room provided by the present application.
Fig. 2 is a flowchart of an operation interaction method of a live broadcast room provided in an embodiment, where the operation interaction method of the live broadcast room is applied to a three-dimensional live broadcast room, and the operation interaction method of the live broadcast room may be executed in an operation interaction device of the live broadcast room, such as a client, where the client may be an audience client.
Specifically, as shown in fig. 2, the operation interaction method of the live broadcast room may include the following steps:
s110, starting a three-dimensional live broadcast room mode, downloading three-dimensional live broadcast room resources from a server, and generating a three-dimensional live broadcast room model on a display interface, wherein corresponding function controls are arranged in different display areas on the three-dimensional live broadcast room model.
Normally, a user watches the anchor's live broadcast in a normal mode, that is, the traditional live broadcast mode, which gives viewers the visual effect of watching the anchor in a two-dimensional environment. In this embodiment, a switch button is provided on the two-dimensional live broadcast interface for switching the two-dimensional live broadcast interface in the normal mode to the three-dimensional live broadcast interface in the three-dimensional environment, and a switch button is likewise provided on the three-dimensional live broadcast interface for switching back to the two-dimensional live broadcast interface in the normal mode.
In this embodiment, the switch button may float over the live interface; that is, whether the user is on the home page of the live interface or has entered a specific live broadcast room, the switch button is still displayed at a set position on the display screen so that the user can switch the live broadcast room mode at any time. Optionally, when the user clicks the switch button, the three-dimensional live broadcast room in the three-dimensional environment is opened. After receiving the user's trigger operation on the switch button, the client sends a start instruction for starting the three-dimensional live broadcast room to the server.
After receiving the start instruction sent by the client, the server obtains the three-dimensional live broadcast room resource in the panoramic environment corresponding to the start instruction. The client downloads the three-dimensional live broadcast room resource from the server through a relevant protocol; the resource may include live broadcast room home page information, background images, live videos, three-dimensional live broadcast room model files and the like of the three-dimensional live broadcast room.
Further, after receiving the three-dimensional live broadcast room resource delivered by the server, the client parses the resource and renders a three-dimensional live broadcast room model on its local display interface according to the resource. The three-dimensional live broadcast room model may comprise at least two display surfaces. A display surface can be used to display related live broadcast information, such as a three-dimensional live broadcast room list, and to render live video; after the user enters a specific three-dimensional live broadcast room, a display surface can also be used to display bullet-screen messages, broadcast messages, public-screen messages of users speaking, or gift special-effect information.
In this embodiment, since the three-dimensional live room model may have a plurality of display surfaces, for the convenience of understanding, the three-dimensional live room model may be analogized to a cube, and different display surfaces of the three-dimensional live room model are analogized to different surfaces (side surfaces, top surfaces, bottom surfaces, and the like) on the cube. Different display areas are displayed on different display surfaces of the three-dimensional live broadcast room model, different live broadcast contents can be displayed in the different display areas, and corresponding function controls are arranged in the different display areas.
In this embodiment, the three-dimensional live broadcast room model is provided with a plurality of function controls, such as a zoom-in control, a zoom-out control, a switching control, and entry and exit controls. Optionally, the function controls may be arranged and fixed on a display surface of the three-dimensional live broadcast room model; alternatively, the function controls may float over the three-dimensional live broadcast room model so that they remain displayed on the display interface as the three-dimensional live broadcast room deflects, which is convenient for the user to operate.
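For concreteness, the following is a minimal sketch of how a client might model the cube-like room, its display surfaces, and the function controls attached to each display area. It is an illustrative assumption only; none of the class or property names are defined by this application.

```kotlin
// Illustrative data model only; all names are assumptions, not the patent's API.
enum class Face { FRONT, BACK, LEFT, RIGHT, TOP, BOTTOM }

// A function control attached to a display area, e.g. "mute", "play", "zoom in".
data class FunctionControl(
    val id: String,              // e.g. "mute", "zoomIn"
    val iconLabel: String,       // e.g. "+", "-", a mute glyph
    val action: () -> Unit       // executed when the control is invoked
)

// One display area on one display surface of the cube-like room model.
data class DisplayArea(
    val face: Face,
    val content: String,                  // e.g. "live video of anchor No. 1"
    val controls: List<FunctionControl>,  // function controls arranged in this area
    val floating: Boolean = false         // true if the controls float over the model
)

// The three-dimensional live broadcast room model downloaded from the server.
data class LiveRoom3DModel(
    val roomId: String,
    val areas: Map<Face, DisplayArea>
)
```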
S120, obtaining a deflection angle of the three-dimensional live broadcast room model, determining a display area of the three-dimensional live broadcast room according to the deflection angle, and rendering on the display interface.
In the three-dimensional live broadcast room mode, the viewer is placed inside the three-dimensional live broadcast room model, and by changing the viewing angle the user can see the scenes around the three-dimensional live broadcast room model. Optionally, the user may change the display area presented on the display interface by rotating the three-dimensional live broadcast room model, and the display area is rendered on the display interface of the client to obtain a live broadcast room picture.
Optionally, the three-dimensional live broadcast room model is deflected by clicking or sliding: for example, clicking the right side of the display interface translates the three-dimensional live broadcast room model to the right, and sliding clockwise on the touch screen of the display interface deflects the model clockwise. Optionally, the gyroscope inside the mobile terminal, such as a mobile phone or tablet computer, can be enabled, and rotating the mobile terminal then deflects the three-dimensional live broadcast room model.
Because the three-dimensional live broadcast room model is a three-dimensional space model, live broadcast room pictures presented on a two-dimensional display interface of the client are different according to the deflection angle of the three-dimensional live broadcast room model.
For example, different contents are displayed on different display surfaces of the three-dimensional live broadcast room model, and correspondingly, different display surfaces have different function controls. Fig. 3 is a schematic diagram of a three-dimensional live broadcast room model rendered by a display interface according to an embodiment. As shown in fig. 3, the display interface generates a three-dimensional live broadcast room model 40; a live video 50 of anchor No. 1 is displayed on the rear display surface of the three-dimensional live broadcast room model 40, a live video 60 of anchor No. 2 and a gift icon 70 of a virtual gift are displayed on the left display surface, and a gift icon 70 of the virtual gift is displayed on the right display surface. When the deflection of the three-dimensional live broadcast room model is zero, the user's viewing angle does not deflect, the user's line of sight directly faces the rear display surface of the model, and the live broadcast room picture displayed on the display interface is as shown in fig. 3, where the display interface mainly displays the display area corresponding to the rear display surface. Since the rear display surface is the main display area at this moment, optionally, some controls corresponding to the rear display surface, such as the 'mute', 'play', 'zoom in' and 'zoom out' function controls, are displayed on the display interface.
For another example, if the three-dimensional live broadcast room model deflects to the left (e.g., translates and rotates), the user's viewing angle also deflects to the left. At this time the user's line of sight directly faces the left display surface of the three-dimensional live broadcast room model, and the live broadcast room picture presented on the display interface is as shown in fig. 4, which is another schematic view of a three-dimensional live broadcast room model rendered by the display interface according to an embodiment; the display interface mainly renders the live video 60 of anchor No. 2. Due to the deflection of the three-dimensional live broadcast room model 40, the left display surface becomes the main display area, and optionally some controls corresponding to the left display surface appear on the display interface, such as the 'pull-down', 'clear', 'mute' and 'play' function controls. Optionally, the types of function controls can be set according to the content rendered on the display surface: when the content rendered on a display surface differs, the function controls corresponding to its display area also differ.
S130, receiving an interactive operation input by a user, determining a function control of the display area called by the interactive operation, and executing a function corresponding to the function control.
In an embodiment, the interactive operation may be an operation acting on the display interface. It may be a touch interactive operation, where the display interface is a touch screen and the user inputs the operation with a finger or a stylus, such as a single-point or multi-point touch operation of clicking, long-pressing or sliding; or it may be a non-touch operation, such as an interactive operation performed through an input device such as a mouse. The interactive operation may also be one that does not act on the display interface, such as a rotation interactive operation that acts on the three-dimensional live broadcast room model by rotating the mobile terminal.
In this embodiment, the function control corresponding to the current display area is obtained, after an interactive operation input by a user is received, an association relationship between the interactive operation and the function control corresponding to the current display area is determined, the function control corresponding to the interactive operation is called according to the association relationship, and a function corresponding to the function control is executed.
When the user performs the same interactive operation, the function control that is called may differ if the display area rendered on the current display interface differs. For example, if the current display area is the live broadcast home page of the three-dimensional live broadcast room and the user performs an upward sliding interactive operation, the list page of the live broadcast home page is scrolled up; if the current display area is the live video picture of live broadcast room No. 3 and the user performs an upward sliding interactive operation, the user exits live broadcast room No. 3 and switches to the next live broadcast room.
In the operation interaction method for the live broadcast room provided by the embodiment, a three-dimensional live broadcast room model in a three-dimensional live broadcast environment is generated on a display interface, and corresponding function controls are arranged in different display areas on the three-dimensional live broadcast room model; obtaining a deflection angle of a three-dimensional live broadcast room model, determining a display area of the three-dimensional live broadcast room according to the deflection angle, and rendering on a display interface; and receiving interactive operation input by a user, and determining a function control of a display area called by the interactive operation so as to execute a function corresponding to the function control, thereby enriching the interactive operation in the environment of the three-dimensional live broadcast room, improving the interactive efficiency of the live broadcast room and improving the user experience.
In order to make the technical solution of the present application clearer and easier to understand, the specific implementation processes and modes in the technical solution are described in detail below.
In an embodiment, the obtaining of the deflection angle of the three-dimensional live broadcast room model in step S120, determining a display area of the three-dimensional live broadcast room according to the deflection angle, and rendering on the display interface may include the following steps:
s1201, obtaining a rotation angle of a gyroscope inside the mobile terminal, and determining a deflection angle of the three-dimensional live broadcast room model according to the rotation angle.
In this embodiment, the rotation angle of the gyroscope inside the mobile terminal has a corresponding relationship with the deflection angle of the three-dimensional live broadcast room model, where the rotation angle includes both direction and magnitude. Optionally, the correspondence between the rotation angle of the gyroscope and the deflection angle of the three-dimensional live broadcast room model may be preset. They may correspond in an equal manner, for example a gyroscope rotation of 10 degrees corresponds to a model deflection of 10 degrees; or they may correspond in an equal-proportion manner, for example a gyroscope rotation of 10 degrees corresponds to a model deflection of 30 degrees.
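A minimal sketch of the equal or proportional correspondence described above; the scale factor and the names below are assumptions for illustration, not fixed by this application.

```kotlin
// Maps a gyroscope rotation (degrees) to a model deflection (degrees).
// scale = 1.0 gives the "equal" correspondence (10 deg -> 10 deg);
// scale = 3.0 gives the proportional example in the text (10 deg -> 30 deg).
class DeflectionMapper(private val scale: Float = 1.0f) {
    fun toModelDeflection(gyroRotationDeg: Float): Float = gyroRotationDeg * scale
}

fun main() {
    val equal = DeflectionMapper(1.0f)
    val scaled = DeflectionMapper(3.0f)
    println(equal.toModelDeflection(10f))   // 10.0
    println(scaled.toModelDeflection(10f))  // 30.0
}
```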
S1202, based on the corresponding relation between the deflection angle and the spatial position of each display area of the three-dimensional live broadcast room model, determining the display area of the three-dimensional live broadcast room model corresponding to the deflection angle.
In this embodiment, the deflection angle of the three-dimensional live broadcast room model and the spatial position of each display area of the three-dimensional live broadcast room model have a corresponding relationship, and optionally, the corresponding relationship between the deflection angle and the spatial position of each display area may be a continuous corresponding relationship or a discrete corresponding relationship. The continuous corresponding relation means that each deflection angle has a corresponding space position of different display areas, and the discrete corresponding relation means that the deflection angles in the same interval all correspond to the space position of the same display area. Optionally, an association table is established between the deflection angle and the spatial position of each display area.
After the rotation angle of the gyroscope is obtained and the deflection angle of the three-dimensional live broadcast room is derived from it, the display area of the three-dimensional live broadcast room model at the spatial position corresponding to the current deflection angle is found according to the association table between deflection angles and the spatial positions of the display areas.
For example, when the gyroscope does not rotate, that is, the mobile terminal is held upright, the three-dimensional live broadcast room model does not deflect, and the live broadcast picture rendered on the display interface is the upright picture of the model. When the mobile terminal deflects 30 degrees to the right, the internal gyroscope is detected to deflect 30 degrees to the right as well, and the corresponding display area is the one observed facing the three-dimensional live broadcast room model after it has deflected 30 degrees to the right.
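The discrete correspondence described above (intervals of deflection angle mapped to one display area) could be kept in a small association table along these lines. The 90-degree interval boundaries and the sign convention (positive yaw meaning deflection to the left) are illustrative assumptions.

```kotlin
enum class Face { FRONT, BACK, LEFT, RIGHT, TOP, BOTTOM }

// Discrete association table: each yaw interval maps to one display surface.
// Interval boundaries are assumed; the text only requires that such a table exists.
fun faceForYaw(yawDeg: Float): Face {
    val yaw = ((yawDeg % 360f) + 360f) % 360f    // normalize to [0, 360)
    return when {
        yaw < 45f || yaw >= 315f -> Face.BACK    // no deflection: rear surface is in view
        yaw < 135f               -> Face.LEFT    // deflected left: left surface is in view
        yaw < 225f               -> Face.FRONT
        else                     -> Face.RIGHT
    }
}
```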
And S1203, rendering the live broadcast room picture corresponding to the display area on a display interface.
In this embodiment, the live broadcast room picture rendered on the display interface is the display area of the three-dimensional live broadcast room model directly facing the user's current viewing angle. A display area of the live broadcast room is provided with function controls, which can be embedded in the display area or float over it; the control icons of the function controls corresponding to the display area are rendered at corresponding positions of the display interface, such as the right area or the top of the display interface.
In an embodiment, the operation interaction method of the live broadcast room further includes the following steps:
s1204, detecting a function control corresponding to the display area, and suspending a control icon corresponding to the function control on a display interface
Different live broadcast room pictures correspond to different function controls, and different function controls correspond to different control icons; for example, the control icon of the zoom-in function control is "+" and the control icon of the zoom-out function control is "-". Optionally, in an embodiment, the function control corresponding to the live broadcast room picture is detected and displayed in a floating manner at a set position of the display interface, for example at the right side of the display interface. When another live broadcast room picture is rendered on the display interface, the function controls corresponding to that picture are floated on the display interface instead, which makes it convenient for the user to interact with the function controls of the display area.
In an embodiment, the step S130 of receiving the interactive operation input by the user and determining the function control of the presentation area invoked by the interactive operation may include the following steps:
s1301, receiving interactive operation input on a control icon by a user acting on the display interface, and determining that the interactive operation calls a function control corresponding to the control icon on the display area.
In this embodiment, the control icon corresponding to the function control is displayed on the display interface, and the user may perform an interactive operation, such as clicking, sliding, long-pressing, and the like, on the control icon on the display interface through a touch mode and the like, and may also perform an interactive operation, such as clicking, sliding, long-pressing, and the like, on the control icon on the display interface through a mouse through a non-touch mode. And after the control icon receives the interactive operation, calling the corresponding function control, and executing the corresponding function on the display area.
In an embodiment, the receiving the interactive operation input by the user in step S130, and determining the function control of the display area called by the interactive operation may include receiving the interactive operation input by the user acting on the display interface, acquiring a state parameter of the interactive operation acting on the display area, and calling the corresponding function control according to the state parameter.
The state parameters include parameters such as position, pressure, duration, number of touch points and deflection angle.
In this embodiment, the function control is called through other interactive operations rather than by acting directly on the control icon of that function control. For example, the zoom-in function control can be called by a quick double-click, with the same effect as directly clicking the "+" control icon of the zoom-in function control.
When the user inputs an interactive operation on the display interface, such as a touch operation or a mouse operation, a change in the state parameters of the corresponding display area is detected, such as a change in the position coordinates of the operation, the duration and pressure of a long press, or the sliding track. The called function control is then obtained from these state parameters; for example, if the duration and magnitude of the pressure applied to the display interface are detected to meet set conditions, the corresponding function control for exiting the current live broadcast room is triggered.
In an embodiment, if the interactive operation is a click interactive operation, the step S1302 of obtaining the state parameter of the interactive operation acting on the display area includes:
s3101, acquiring two-dimensional position parameters acted on the display interface by the click interaction operation.
The click interaction operation may be a single-point touch interaction operation, a double-point touch interaction operation, a multi-point touch interaction operation, or the like, and may also be a single-click interaction operation, a double-click interaction operation, or the like. In the click interaction operation, the click interaction parameters may include a click location parameter, a click number parameter, and the like.
In this embodiment, each position coordinate of the display interface is defined according to the resolution corresponding to the display interface, and a two-dimensional position parameter of one or more click touch points acted on the display interface by a click interaction operation is obtained. In another embodiment, the number of clicks of the click interaction operation in a preset time period may also be obtained.
S3102, based on the position corresponding relation between the display interface and the display area, the two-dimensional position parameters are converted into three-dimensional position parameters on a three-dimensional live broadcast room model corresponding to the display area.
In this embodiment, the display area is located on the three-dimensional live broadcast room model, and a position acting on the display area is represented by three-dimensional position coordinates.
When the user clicks on the screen of the display interface, the two-dimensional position parameters of the screen need to be converted into three-dimensional position parameters on the three-dimensional live broadcast room model. The two-dimensional position parameters of the display interface and the three-dimensional position parameters on the three-dimensional live broadcast room model have a corresponding relationship, which can be realized through existing techniques such as the inverse viewport transformation matrix, the inverse projection transformation matrix and the inverse model transformation matrix.
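A compact sketch of this screen-to-model conversion, assuming the combined inverse of the model, view and projection transforms is available as a single 4x4 matrix; the helper names and the choice of a row-major float array are assumptions, not any specific engine's API.

```kotlin
// Unprojects a 2D screen touch to a 3D point related to the room model.
// invMvp: inverse of the combined model-view-projection matrix, 16 floats, row-major.
fun unproject(
    screenX: Float, screenY: Float,   // touch position in pixels
    screenW: Float, screenH: Float,   // display interface resolution
    depthNdc: Float,                  // depth in [-1, 1] at which to intersect
    invMvp: FloatArray
): FloatArray {
    // 1) Inverse viewport transform: pixels -> normalized device coordinates.
    val ndcX = 2f * screenX / screenW - 1f
    val ndcY = 1f - 2f * screenY / screenH    // screen Y grows downward
    val ndc = floatArrayOf(ndcX, ndcY, depthNdc, 1f)

    // 2) Inverse projection, view and model transforms applied in one multiply.
    val out = FloatArray(4)
    for (row in 0 until 4) {
        var acc = 0f
        for (col in 0 until 4) acc += invMvp[row * 4 + col] * ndc[col]
        out[row] = acc
    }

    // 3) Perspective divide gives the 3D position on the model.
    val w = if (out[3] != 0f) out[3] else 1f
    return floatArrayOf(out[0] / w, out[1] / w, out[2] / w)
}
```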
S3103, calling a function control acting on a display area corresponding to the three-dimensional live broadcast room model according to the three-dimensional position parameters.
In this embodiment, the display surface on which the display area corresponding to the three-dimensional live broadcast room model is located is determined according to the three-dimensional position parameter. For example, the three-dimensional position parameter has three coordinate values X, Y and Z. Optionally, the display surface to which the click currently belongs is determined from the calculated click coordinate (X, Y, Z): when X = 1, the click position is on the right display surface of the three-dimensional live broadcast room model; when X = -1, on the left display surface; when Y = 1, on the upper display surface; when Y = -1, on the lower display surface; when Z = 1, on the rear display surface; and when Z = -1, on the front display surface.
After determining which display surface the click position is located on, the specific position of the click on that display surface is further determined according to the other two coordinate values (equivalent to two-dimensional coordinates on that surface), thereby determining the function control in the corresponding display area.
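Following the unit-cube convention of the example above (faces at coordinate value +1 or -1), face selection and the remaining two in-plane coordinates could be derived as in this sketch; the tolerance value and the pairing of in-plane axes are assumptions.

```kotlin
import kotlin.math.abs

enum class Face { FRONT, BACK, LEFT, RIGHT, TOP, BOTTOM }

// Picks the display surface from a click point (x, y, z) on a unit cube whose faces
// sit at coordinate value +1 / -1, as in the example; eps is an assumed tolerance.
// Returns the face and the two remaining coordinates as the in-plane position.
fun pickFace(x: Float, y: Float, z: Float, eps: Float = 0.01f): Pair<Face, Pair<Float, Float>>? =
    when {
        abs(x - 1f) < eps -> Face.RIGHT to (z to y)   // X = 1: right display surface
        abs(x + 1f) < eps -> Face.LEFT to (z to y)    // X = -1: left display surface
        abs(y - 1f) < eps -> Face.TOP to (x to z)     // Y = 1: upper display surface
        abs(y + 1f) < eps -> Face.BOTTOM to (x to z)  // Y = -1: lower display surface
        abs(z - 1f) < eps -> Face.BACK to (x to y)    // Z = 1: rear display surface
        abs(z + 1f) < eps -> Face.FRONT to (x to y)   // Z = -1: front display surface
        else -> null                                  // click did not land on a surface
    }
```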
In an embodiment, if the interactive operation is a sliding interactive operation, the step S1302 of obtaining a state parameter of the interactive operation acting on the display area includes:
s3201, receiving sliding interactive operation input by a user on the display interface.
Optionally, the user performs a sliding interactive operation on the display interface of the touch screen with a finger or a stylus, or inputs a sliding interactive operation on the display interface by holding down a mouse button and dragging, such as sliding left, right, up, down, clockwise or counterclockwise.
S3202, determining a sliding state parameter of the sliding interaction operation acting on a display area according to the sliding interaction operation, wherein the sliding state parameter comprises: at least one of a slide position coordinate, a slide direction, and a slide speed.
In this embodiment, the sliding track of the sliding interactive operation acting on the display interface is obtained, and the sliding state parameters are extracted from the sliding track. Optionally, several two-dimensional position coordinates on the sliding track are sampled at a preset time interval, and these two-dimensional position coordinates on the display interface are converted into three-dimensional position coordinates on the three-dimensional live broadcast room model. The sliding direction, sliding speed and the like are then determined from these position coordinates.
And S3203, calling the corresponding function control according to the sliding state parameter.
In this embodiment, combinations of different sliding state parameters correspond to different function controls. If the sliding speed is the same but the sliding direction differs, different function controls can be called: for example, a slow single-point upward slide correspondingly moves the three-dimensional live broadcast room up, and a slow single-point downward slide correspondingly moves it down. Likewise, if the sliding direction is the same but the sliding speed differs, different function controls can be called: for example, a fast single-point leftward slide correspondingly switches to the next three-dimensional live broadcast room, while a slow single-point leftward slide correspondingly moves the three-dimensional live broadcast room to the left. Slides with the same direction and speed but different sliding positions can also correspond to different function controls.
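One way the direction/speed combinations above could be dispatched to function controls is sketched below. The fast/slow threshold, the control names, and the up/down mappings are illustrative assumptions, not a mapping fixed by this application.

```kotlin
enum class SlideDirection { UP, DOWN, LEFT, RIGHT }

data class SlideState(
    val direction: SlideDirection,
    val speedPxPerSec: Float,
    val startX: Float,   // sliding position coordinates; could further refine the
    val startY: Float    // dispatch (not used in this sketch)
)

// Dispatches a single-point slide to a function control name.
// The 800 px/s fast/slow threshold is an assumption for illustration.
fun dispatchSlide(state: SlideState, fastThreshold: Float = 800f): String {
    val fast = state.speedPxPerSec >= fastThreshold
    return when (state.direction) {
        SlideDirection.LEFT  -> if (fast) "switchToNextRoom" else "moveRoomLeft"
        SlideDirection.RIGHT -> if (fast) "switchToPrevRoom" else "moveRoomRight"
        SlideDirection.UP    -> if (fast) "scrollHomePageUp" else "moveRoomUp"
        SlideDirection.DOWN  -> if (fast) "exit3DRoomModel"  else "moveRoomDown"
    }
}
```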
Optionally, in an embodiment, a corresponding zooming control is called according to the sliding position coordinate and the sliding direction, so as to zoom the display area.
In this embodiment, two touch points sliding toward each other call the zoom-out function control, and two touch points sliding away from each other call the zoom-in function control. When the position coordinates of the two points move toward each other, the distance between the user's two fingers decreases, the camera is pulled back, the user's field of view on the display interface is enlarged and the display area is reduced, so the user can simultaneously see the live videos on multiple display surfaces on the display interface. In an embodiment, it can further be determined whether the zoom-out factor has reached a boundary condition; when the display area has been reduced to a certain degree and reaches the boundary condition, it is not reduced further, preventing the display area from becoming too small for the user to watch the live video normally.
Similarly, when the position coordinates of the two points move away from each other, the distance between the user's two fingers increases, the camera is pushed forward, the user's field of view on the display interface is reduced and the live video picture is enlarged. In an embodiment, it can further be determined whether the magnification has reached a boundary condition; when the display area has been enlarged to a certain degree and reaches the boundary condition, it is not enlarged further, preventing the display area from becoming too large to show the complete live video picture, which would affect normal viewing.
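A minimal sketch of the two-finger zoom with the boundary checks described above; the minimum and maximum scale values and the class name are assumptions.

```kotlin
// Two-finger zoom of the display area with boundary conditions.
// minScale / maxScale are illustrative assumptions for the boundary conditions.
class DisplayAreaZoom(
    private val minScale: Float = 0.5f,   // boundary: do not shrink past this
    private val maxScale: Float = 3.0f    // boundary: do not enlarge past this
) {
    var scale: Float = 1.0f
        private set

    // prevDistance / currDistance: distance in pixels between the two touch points
    // on the previous and current frames. Fingers moving apart enlarge the area,
    // fingers moving together shrink it.
    fun onPinch(prevDistance: Float, currDistance: Float) {
        if (prevDistance <= 0f) return
        val factor = currDistance / prevDistance
        scale = (scale * factor).coerceIn(minScale, maxScale)
    }
}
```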
Optionally, in an embodiment, a corresponding switching control is called according to the sliding position coordinate, the sliding direction, and the sliding speed, so as to switch live videos of multiple live broadcast rooms or exit from the three-dimensional live broadcast room model.
In this embodiment, multiple live videos are displayed on different display surfaces of the three-dimensional live broadcast room model. By selecting a live video and continuously sliding it to a target display area on a target display surface, the selected live video is dragged from its previous display position to the target display area, realizing position switching of the videos of multiple live broadcast rooms. For example, a first live broadcast room is on a first display surface and a second live broadcast room is on a second display surface; when the first live broadcast room is dragged to the second display surface, the second live broadcast room can automatically move to the first display surface, so that the positions of the two live broadcast rooms are exchanged.
Optionally, the current live video can be switched to the next hidden live video through a sliding interactive operation, and the current live broadcast room or the three-dimensional live broadcast room model can also be exited through a sliding interactive operation.
In an embodiment, if the interactive operation is a rotation interactive operation, the step of determining the function control of the presentation area invoked by the interactive operation, which is received in step S130 and input by the user, includes:
s3301, receiving rotation interactive operation input by a user through a rotation mobile terminal, and determining a deflection parameter of an internal gyroscope according to the rotation interactive operation.
In this embodiment, the gyroscope of the mobile terminal is turned on, and when the user rotates the mobile terminal, the gyroscope inside the mobile terminal also rotates, and the three-dimensional live broadcast room model generated on the display interface of the mobile terminal also rotates accordingly. The rotation angle of the mobile terminal has an association relation with the deflection parameters of the internal gyroscope, and the deflection parameters comprise the deflection angle, the deflection direction and the like of the gyroscope.
S3302, determining, based on the deflection parameter, a display surface of the three-dimensional live broadcast room model corresponding to the deflection parameter, and switching to play a live video on that display surface; wherein the display surfaces of the three-dimensional live broadcast room model are rendered to obtain live videos of at least two live broadcast rooms.
In this embodiment, the user's current viewing angle is determined by obtaining the deflection parameter of the gyroscope built into the mobile terminal; if the gyroscope deflects 30 degrees to the left, the user's viewing angle is considered to deflect 30 degrees to the left. Each display surface of the three-dimensional live broadcast room model has a one-to-one correspondence with the user's viewing angle, and the designated display surface at the spatial position the user currently faces can be determined according to the current viewing angle.
For example, if the deflection parameter is zero, that is, no deflection occurs, it is determined that the current viewing angle directly faces the rear display surface of the three-dimensional live broadcast room model; if the deflection parameter indicates that the current viewing angle deflects 90 degrees to the left, it is determined that the current viewing angle directly faces the left display surface of the three-dimensional live broadcast room model.
After the display surface facing the current viewing angle is determined, playback is switched to the live video facing the current viewing angle. Optionally, playback of the live video on the display surface facing the previous viewing angle is reduced or stopped, for example only a low-quality live video picture is played without audio, or both picture and audio are stopped completely; meanwhile, playback of a high-quality video picture is started on the display surface facing the current viewing angle, together with its audio.
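The switch between the previously faced surface and the currently faced surface might be coordinated along these lines; the player interface and the choice to drop the old surface to a muted low-quality stream are assumptions made for illustration, not a required implementation.

```kotlin
enum class Face { FRONT, BACK, LEFT, RIGHT, TOP, BOTTOM }

// Minimal player abstraction; a real client would wrap its streaming SDK here.
interface RoomPlayer {
    fun playHighQuality(withAudio: Boolean)
    fun playLowQuality(withAudio: Boolean)
    fun stop()
}

class FaceSwitcher(private val players: Map<Face, RoomPlayer>) {
    private var currentFace: Face? = null

    // Called when the gyroscope deflection parameter resolves to a new facing surface.
    fun onFacingChanged(newFace: Face, keepOldAsPreview: Boolean = true) {
        if (newFace == currentFace) return
        currentFace?.let { old ->
            val oldPlayer = players[old] ?: return@let
            if (keepOldAsPreview) oldPlayer.playLowQuality(withAudio = false)  // muted low-quality picture
            else oldPlayer.stop()                                              // stop picture and audio
        }
        players[newFace]?.playHighQuality(withAudio = true)  // high quality plus audio on the facing surface
        currentFace = newFace
    }
}
```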
Optionally, when the mobile terminal is rotated, after a certain display surface is determined according to the deflection parameter of the internal gyroscope, the display surface is highlighted to remind the user of the display surface corresponding to the current rotation angle.
The following describes in detail a related embodiment of the operation interaction apparatus of the live broadcast room.
Fig. 5 is a schematic structural diagram of an operation interaction apparatus of a live broadcast room according to an embodiment. The operation interaction apparatus of the live broadcast room may be implemented in a computer device serving as an operation interaction device of the live broadcast room, and this device may in turn be a client, such as an audience client. This embodiment is described by taking the audience client as the operation interaction device of the live broadcast room as an example.
Specifically, as shown in Fig. 5, the operation interaction apparatus 100 of the live broadcast room includes: a rendering module 110, a display module 120, and a calling module 130.
The rendering module 110 is configured to start a three-dimensional live broadcast room mode, download three-dimensional live broadcast room resources from a server, and generate a three-dimensional live broadcast room model on a display interface, where different display areas on the three-dimensional live broadcast room model are provided with corresponding function controls;
the display module 120 is configured to obtain a deflection angle of the three-dimensional live broadcast room model, determine a display area of the three-dimensional live broadcast room according to the deflection angle, and render the display area on the display interface;
the invoking module 130 is configured to receive an interactive operation input by a user, determine a function control of the display area invoked by the interactive operation, and execute a function corresponding to the function control.
In the operation interaction apparatus of the live broadcast room provided in this embodiment, the rendering module 110 generates a three-dimensional live broadcast room model of a three-dimensional live broadcast environment on the display interface, with corresponding function controls arranged in different display areas of the three-dimensional live broadcast room model; the display module 120 obtains the deflection angle of the three-dimensional live broadcast room model, determines the display area of the three-dimensional live broadcast room according to the deflection angle, and renders it on the display interface; and the calling module 130 receives the interactive operation input by the user, determines the function control of the display area called by the interactive operation, and executes the function corresponding to the function control. This enriches the interactive operations in the three-dimensional live broadcast room environment, improves the interaction efficiency of the live broadcast room, and improves the user experience.
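As an outline only, the three modules could be expressed as interfaces along the following lines; the method names are assumptions introduced for illustration and are not the apparatus's actual interfaces.

```kotlin
// Illustrative outline of the apparatus structure; signatures are assumed, not specified.
interface RenderingModule {
    // Start the three-dimensional live broadcast room mode: download resources and build the model.
    fun enterThreeDimensionalMode()
}

interface DisplayModule {
    // Pick the display area for the given deflection angle and render it on the display interface.
    fun onDeflectionAngle(angleDeg: Float)
}

interface CallingModule {
    // Resolve the function control targeted by a user interaction and execute it.
    fun onInteraction(rawEvent: Any)
}
```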
In one embodiment, the display module 120 includes a deflection angle determining unit, a display area determining unit, and a live broadcast room picture rendering unit.
The deflection angle determining unit is configured to acquire the rotation angle of the gyroscope inside the mobile terminal and determine the deflection angle of the three-dimensional live broadcast room model according to the rotation angle; the display area determining unit is configured to determine the corresponding display area on the three-dimensional live broadcast room model according to the deflection angle, based on the corresponding relation between the deflection angle and the spatial position of each display area of the three-dimensional live broadcast room model; and the live broadcast room picture rendering unit is configured to render the live broadcast room picture corresponding to the display area on the display interface.
In an embodiment, the operation interaction apparatus 100 of the live broadcast room further includes a control icon setting module, configured to detect the function control corresponding to the display area and to display the control icon corresponding to the function control as a floating element on the display interface.
The calling module 130 includes a first calling unit, configured to receive an interactive operation input by the user acting on the control icon on the display interface, and to determine that the interactive operation calls the function control corresponding to the control icon on the display area.
In one embodiment, the calling module 130 includes a second calling unit, configured to receive an interactive operation input by the user acting on the display interface, acquire the state parameter of the interactive operation acting on the display area, and call the corresponding function control according to the state parameter.
In one embodiment, the interactive operation is a click interactive operation.
The second calling unit includes a click calling subunit, configured to acquire the two-dimensional position parameter of the click interactive operation acting on the display interface; convert the two-dimensional position parameter into a three-dimensional position parameter on the three-dimensional live broadcast room model corresponding to the display area, based on the position correspondence between the display interface and the display area; and call the function control on the corresponding display area of the three-dimensional live broadcast room model according to the three-dimensional position parameter.
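One simplified way to realize the two-dimensional-to-three-dimensional conversion of the click calling subunit is sketched below, under the assumption that the facing display surface occupies a known screen rectangle and is a planar quad in model space; Vec3, SurfaceQuad and the bilinear mapping are illustrative helpers, not the patent's algorithm.

```kotlin
// Hypothetical geometry helpers for mapping a tap point to a point on the model surface.
data class Vec3(val x: Float, val y: Float, val z: Float)

data class SurfaceQuad(
    val topLeft: Vec3, val topRight: Vec3,
    val bottomLeft: Vec3, val bottomRight: Vec3,
    val screenLeft: Float, val screenTop: Float,
    val screenRight: Float, val screenBottom: Float
)

// Returns the three-dimensional position parameter on the display area, or null if the tap
// falls outside the rectangle that the facing display surface occupies on the display interface.
fun screenToModel(tapX: Float, tapY: Float, quad: SurfaceQuad): Vec3? {
    if (tapX !in quad.screenLeft..quad.screenRight) return null
    if (tapY !in quad.screenTop..quad.screenBottom) return null
    val u = (tapX - quad.screenLeft) / (quad.screenRight - quad.screenLeft)
    val v = (tapY - quad.screenTop) / (quad.screenBottom - quad.screenTop)
    fun lerp(a: Vec3, b: Vec3, t: Float) =
        Vec3(a.x + (b.x - a.x) * t, a.y + (b.y - a.y) * t, a.z + (b.z - a.z) * t)
    val top = lerp(quad.topLeft, quad.topRight, u)
    val bottom = lerp(quad.bottomLeft, quad.bottomRight, u)
    return lerp(top, bottom, v)
}
```

The resulting three-dimensional position can then be tested against the regions bound to the function controls of that display surface.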
In one embodiment, the interactive operation is a sliding interactive operation.
The second calling unit includes a sliding calling subunit, configured to receive a sliding interactive operation input by the user on the display interface; determine, according to the sliding interactive operation, the sliding state parameter of the sliding interactive operation acting on the display area, where the sliding state parameter includes at least one of a sliding position coordinate, a sliding direction, and a sliding speed; and call the corresponding function control according to the sliding state parameter.
In an embodiment, the sliding calling subunit is configured to call a corresponding zooming control according to the sliding position coordinate and the sliding direction so as to zoom the display area, or to call a corresponding switching control according to the sliding position coordinate, the sliding direction, and the sliding speed so as to switch the live videos of the multiple live broadcast rooms or exit the three-dimensional live broadcast room model.
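A hedged sketch of how such a sliding calling subunit might dispatch a sliding state parameter to a zooming control or a switching control; the fling threshold, the direction conventions and the callback signatures are assumptions introduced only for illustration.

```kotlin
// Assumed sliding state, mirroring the parameters named above.
enum class SlideDirection { LEFT, RIGHT, UP, DOWN }

data class SlideState(
    val x: Float,                 // sliding position coordinate
    val y: Float,
    val direction: SlideDirection,
    val speedPxPerMs: Float       // sliding speed
)

class SlideDispatcher(
    private val zoom: (centerX: Float, centerY: Float, zoomIn: Boolean) -> Unit,
    private val switchRoom: (forward: Boolean) -> Unit,
    private val exitModel: () -> Unit,
    private val flingThreshold: Float = 1.5f   // px/ms, assumed value
) {
    fun dispatch(state: SlideState) = when {
        // A fast horizontal fling switches among the live videos of the multiple live rooms.
        state.speedPxPerMs >= flingThreshold && state.direction == SlideDirection.LEFT ->
            switchRoom(true)
        state.speedPxPerMs >= flingThreshold && state.direction == SlideDirection.RIGHT ->
            switchRoom(false)
        // A fast downward fling exits the three-dimensional live broadcast room model.
        state.speedPxPerMs >= flingThreshold && state.direction == SlideDirection.DOWN ->
            exitModel()
        // Slow slides zoom the display area around the touch position.
        state.direction == SlideDirection.UP -> zoom(state.x, state.y, true)
        else -> zoom(state.x, state.y, false)
    }
}
```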
In one embodiment, the interactive operation is a rotation interactive operation.
The calling module 130 includes a deflection parameter determining unit and a video switching and playing unit.
The deflection parameter determining unit is configured to receive a rotation interactive operation input by the user through rotating the mobile terminal and to determine the deflection parameter of the internal gyroscope according to the rotation interactive operation; the video switching and playing unit is configured to determine, based on the deflection parameter, the display surface of the three-dimensional live broadcast room model corresponding to the deflection parameter and to switch to play the live video on that display surface, where live videos of at least two live broadcast rooms are rendered on the display surfaces of the three-dimensional live broadcast room model.
The operation interaction apparatus of the live broadcast room provided above can be used to execute the operation interaction method of the live broadcast room provided by any of the above embodiments, and has the corresponding functions and beneficial effects.
An embodiment of the present invention further provides a computer device, including a memory, a processor, and a computer program stored in the memory and executable on the processor; when the processor executes the program, the operation interaction method of the live broadcast room in any of the above embodiments is implemented.
When the computer device provided by the above embodiment executes the operation interaction method of the live broadcast room provided by any of the above embodiments, the computer device has corresponding functions and beneficial effects.
Embodiments of the present invention also provide a storage medium containing computer-executable instructions, which when executed by a computer processor, are configured to perform a method for operation interaction in a live broadcast room, including:
starting a three-dimensional live broadcast room mode, downloading three-dimensional live broadcast room resources from a server and generating a three-dimensional live broadcast room model on a display interface, wherein corresponding function controls are arranged in different display areas on the three-dimensional live broadcast room model;
acquiring a deflection angle of the three-dimensional live broadcast room model, determining a display area of the three-dimensional live broadcast room according to the deflection angle, and rendering the display area on the display interface;
and receiving interactive operation input by a user, determining the function control of the display area called by the interactive operation, and executing the function corresponding to the function control.
Of course, the storage medium containing the computer-executable instructions provided in the embodiments of the present invention is not limited to the operations of the operation interaction method of the live broadcast room described above, and may also perform related operations of the operation interaction method of the live broadcast room provided in any embodiment of the present invention, with the corresponding functions and beneficial effects.
From the above description of the embodiments, it will be clear to those skilled in the art that the present invention can be implemented by software plus the necessary general-purpose hardware, or certainly by hardware alone, but the former is the preferred implementation in many cases. Based on such understanding, the technical solutions of the present invention may be embodied in the form of a software product, which may be stored in a computer-readable storage medium, such as a floppy disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), a flash memory (FLASH), a hard disk, or an optical disk of a computer, and which includes several instructions for enabling a computer device (which may be a personal computer, a server, or a network device) to execute the operation interaction method of the live broadcast room according to any embodiment of the present invention.
It should be understood that, although the steps in the flowcharts of the figures are shown in the order indicated by the arrows, they are not necessarily performed strictly in that order. Unless explicitly stated herein, the execution of these steps is not strictly limited in order, and they may be performed in other orders. Moreover, at least a portion of the steps in the flowcharts may include multiple sub-steps or multiple stages, which are not necessarily completed at the same time but may be executed at different times, and whose execution order is not necessarily sequential; they may be executed in turns or alternately with other steps or with at least a portion of the sub-steps or stages of other steps.
The foregoing describes only some embodiments of the present application. It should be noted that those skilled in the art can make several modifications and refinements without departing from the principle of the present application, and such modifications and refinements shall also fall within the protection scope of the present application.

Claims (10)

1. An operation interaction method of a live broadcast room is characterized by comprising the following steps:
starting a three-dimensional live broadcast room mode, downloading three-dimensional live broadcast room resources from a server and generating a three-dimensional live broadcast room model comprising at least two display surfaces on a display interface; corresponding function controls are arranged in different display areas on the three-dimensional live broadcast room model;
acquiring a deflection angle of the three-dimensional live broadcast room model; determining, according to the deflection angle and based on the corresponding relation between the deflection angle and the spatial position of the display area of each display surface of the three-dimensional live broadcast room model, the display surface of the three-dimensional live broadcast room that directly faces the sight of the user as a main display area; and rendering, on the display interface, the live broadcast room picture corresponding to the main display area and the part of the corresponding function controls displayed on the main display area; wherein one display surface displays one live broadcast room picture;
receiving an interactive operation input by a user, determining a function control of the main display area called by the interactive operation, and executing a function corresponding to the function control, which comprises the following steps: receiving a sliding interactive operation input on the display interface, and determining a sliding state parameter of the sliding interactive operation acting on the display area, the sliding state parameter comprising at least one of a sliding position coordinate, a sliding direction, and a sliding speed; and calling a corresponding switching control according to the sliding state parameter, so as to switch the live videos of the multiple live broadcast rooms between the display areas of the display surfaces or to exit the three-dimensional live broadcast room model.
2. The operation interaction method of the live broadcast room according to claim 1, wherein the step of acquiring the deflection angle of the three-dimensional live broadcast room model comprises:
and acquiring a rotation angle of a gyroscope inside the mobile terminal, and determining a deflection angle of the three-dimensional live broadcast room model according to the rotation angle.
3. The operation interaction method of the live broadcast room according to claim 1, further comprising: detecting a function control corresponding to the display area, and suspending a control icon corresponding to the function control on a display interface;
the step of receiving the interactive operation input by the user and determining the function control of the display area called by the interactive operation comprises the following steps:
and receiving interactive operation input on the control icon by a user acting on the display interface, and determining that the interactive operation calls the functional control corresponding to the control icon on the display area.
4. The method for operation interaction in a live broadcast room according to claim 1, wherein the step of receiving an interactive operation input by a user and determining a function control of the presentation area invoked by the interactive operation comprises:
receiving interactive operation input by a user on the display interface;
and acquiring state parameters of the interactive operation acting on the display area, and calling a corresponding function control according to the state parameters.
5. The operation interaction method of the live broadcast room according to claim 4, wherein the interaction operation is a click interaction operation;
the step of obtaining the state parameter of the interactive operation acting on the display area and calling the corresponding function control according to the state parameter comprises the following steps:
acquiring a two-dimensional position parameter of the click interaction operation acting on the display interface;
converting the two-dimensional position parameter into a three-dimensional position parameter on a three-dimensional live broadcast room model corresponding to the display area based on the position corresponding relation between a display interface and the display area;
and calling a function control acting on a display area corresponding to the three-dimensional live broadcast room model according to the three-dimensional position parameter.
6. The operation interaction method of the live broadcast room according to claim 1, further comprising:
and calling a corresponding zooming control according to the sliding position coordinate and the sliding direction so as to zoom the display area.
7. The operation interaction method of the live broadcast room according to claim 3, wherein the interaction operation is a rotation interaction operation;
the step of receiving the interactive operation input by the user and determining the functional control of the display area called by the interactive operation comprises the following steps:
receiving rotation interactive operation input by a user through a rotating mobile terminal, and determining deflection parameters of an internal gyroscope according to the rotation interactive operation;
determining, based on the deflection parameter, a display surface of the three-dimensional live broadcast room model corresponding to the deflection parameter, and switching to play the live video on the display surface; wherein live videos of at least two live broadcast rooms are rendered on the display surfaces of the three-dimensional live broadcast room model.
8. An operation interaction device of a live broadcast room is characterized by comprising:
the rendering module, used for starting a three-dimensional live broadcast room mode, downloading three-dimensional live broadcast room resources from a server and generating a three-dimensional live broadcast room model comprising at least two display surfaces on a display interface, wherein different display areas on the three-dimensional live broadcast room model are provided with corresponding function controls;
the display module, used for acquiring a deflection angle of the three-dimensional live broadcast room model; determining, according to the deflection angle and based on the corresponding relation between the deflection angle and the spatial position of the display area of each display surface of the three-dimensional live broadcast room model, the display surface of the three-dimensional live broadcast room that directly faces the user's sight as a main display area; and rendering, on the display interface, the live broadcast room picture corresponding to the main display area and the part of the corresponding function controls displayed on the main display area; wherein one display surface displays one live broadcast room picture;
the calling module, used for receiving the interactive operation input by the user, determining the function control of the main display area called by the interactive operation, and executing the function corresponding to the function control, which comprises: receiving a sliding interactive operation input on the display interface, and determining a sliding state parameter of the sliding interactive operation acting on the display area, the sliding state parameter comprising at least one of a sliding position coordinate, a sliding direction, and a sliding speed; and calling a corresponding switching control according to the sliding state parameter, so as to switch the live videos of the multiple live broadcast rooms between the display areas of the display surfaces or to exit the three-dimensional live broadcast room model.
9. A computer device comprising a memory, a processor and a computer program stored on the memory and executable on the processor, characterized in that the processor, when executing the program, carries out the steps of the method of operational interaction of a live broadcast as claimed in any one of claims 1-7.
10. A storage medium containing computer-executable instructions for performing the steps of the method of operating an interaction for a live broadcast as claimed in any one of claims 1-7 when executed by a computer processor.
CN202010368000.2A 2020-04-30 2020-04-30 Operation interaction method, device and equipment for live broadcast room and storage medium Active CN111586465B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010368000.2A CN111586465B (en) 2020-04-30 2020-04-30 Operation interaction method, device and equipment for live broadcast room and storage medium

Publications (2)

Publication Number Publication Date
CN111586465A CN111586465A (en) 2020-08-25
CN111586465B true CN111586465B (en) 2022-10-04

Family

ID=72120397

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010368000.2A Active CN111586465B (en) 2020-04-30 2020-04-30 Operation interaction method, device and equipment for live broadcast room and storage medium

Country Status (1)

Country Link
CN (1) CN111586465B (en)

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108012195A (en) * 2016-11-01 2018-05-08 北京星辰美豆文化传播有限公司 A kind of live broadcasting method, device and its electronic equipment
CN108650523A (en) * 2018-05-22 2018-10-12 广州虎牙信息科技有限公司 The display of direct broadcasting room and virtual objects choosing method, server, terminal and medium
CN110413261A (en) * 2019-06-26 2019-11-05 上海哔哩哔哩科技有限公司 A kind of configuration method and equipment of direct broadcast function module
CN110850983A (en) * 2019-11-13 2020-02-28 腾讯科技(深圳)有限公司 Virtual object control method and device in video live broadcast and storage medium

Family Cites Families (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105898460A (en) * 2015-12-10 2016-08-24 乐视网信息技术(北京)股份有限公司 Method and device for adjusting panorama video play visual angle of intelligent TV
CN105828090A (en) * 2016-03-22 2016-08-03 乐视网信息技术(北京)股份有限公司 Panorama live broadcasting method and device
CN106686397A (en) * 2016-12-31 2017-05-17 北京星辰美豆文化传播有限公司 Multi-person network broadcasting method and device and electronic equipment thereof
CN108108014A (en) * 2017-11-16 2018-06-01 北京密境和风科技有限公司 A kind of methods of exhibiting, device that picture is broadcast live
CN107911737B (en) * 2017-11-28 2020-06-19 腾讯科技(深圳)有限公司 Media content display method and device, computing equipment and storage medium
CN110087128A (en) * 2019-04-30 2019-08-02 广州虎牙信息科技有限公司 Living broadcast interactive method, living broadcast interactive device and live streaming equipment
CN111050189B (en) * 2019-12-31 2022-06-14 成都酷狗创业孵化器管理有限公司 Live broadcast method, device, equipment and storage medium

Also Published As

Publication number Publication date
CN111586465A (en) 2020-08-25

Similar Documents

Publication Publication Date Title
WO2020248640A1 (en) Display device
US8885057B2 (en) Performing camera control using a remote control device
JP7111288B2 (en) Video processing method, apparatus and storage medium
US20130155171A1 (en) Providing User Input Having a Plurality of Data Types Using a Remote Control Device
US8922615B2 (en) Customizing input to a videoconference using a remote control device
US20130154923A1 (en) Performing Searching for a List of Entries Using a Remote Control Device
CN110784735A (en) Live broadcast method and device, mobile terminal, computer equipment and storage medium
CN112073798B (en) Data transmission method and equipment
CN111556357B (en) Method, device and equipment for playing live video and storage medium
JP2020527883A5 (en)
US9531981B2 (en) Customized mute in a videoconference based on context
CN111277890A (en) Method for acquiring virtual gift and method for generating three-dimensional panoramic live broadcast room
CN113014939A (en) Display device and playing method
CN111246270A (en) Method, device, equipment and storage medium for displaying bullet screen
WO2019092590A1 (en) User interaction in a communication system with the aid of multiple live streaming of augmented reality data
CN109656463A (en) The generation method of individual character expression, apparatus and system
CN111586465B (en) Operation interaction method, device and equipment for live broadcast room and storage medium
CN115379277B (en) VR panoramic video playing method and system based on IPTV service
WO2022170449A1 (en) Method and device for displaying picture window, terminal and storage medium
CN113938633B (en) Video call processing method and display device
WO2020248682A1 (en) Display device and virtual scene generation method
CN115250357A (en) Terminal device, video processing method and electronic device
CN114286077A (en) Virtual reality equipment and VR scene image display method
CN112947783A (en) Display device
US20130155172A1 (en) User Interface for a Display Using a Simple Remote Control Device

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
TA01 Transfer of patent application right
Effective date of registration: 20210108
Address after: 511442 3108, 79 Wanbo 2nd Road, Nancun Town, Panyu District, Guangzhou City, Guangdong Province
Applicant after: GUANGZHOU CUBESILI INFORMATION TECHNOLOGY Co.,Ltd.
Address before: 29th floor, building B-1, Wanda Plaza, Wanbo business district, Nancun Town, Panyu District, Guangzhou City, Guangdong Province
Applicant before: GUANGZHOU HUADUO NETWORK TECHNOLOGY Co.,Ltd.
GR01 Patent grant