CN113504867B - Live broadcast interaction method and device, storage medium and electronic equipment - Google Patents


Info

Publication number
CN113504867B
CN113504867B (application CN202110664339.1A)
Authority
CN
China
Prior art keywords
interaction
live broadcast
area
live
position information
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202110664339.1A
Other languages
Chinese (zh)
Other versions
CN113504867A (en)
Inventor
陈泽宇
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Guangzhou Boguan Information Technology Co Ltd
Original Assignee
Guangzhou Boguan Information Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Guangzhou Boguan Information Technology Co Ltd filed Critical Guangzhou Boguan Information Technology Co Ltd
Priority to CN202110664339.1A priority Critical patent/CN113504867B/en
Publication of CN113504867A publication Critical patent/CN113504867A/en
Application granted granted Critical
Publication of CN113504867B publication Critical patent/CN113504867B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical


Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0484Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63FCARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/85Providing additional services to players
    • A63F13/86Watching games played by other players
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/20Servers specifically adapted for the distribution of content, e.g. VOD servers; Operations thereof
    • H04N21/21Server components or server architectures
    • H04N21/218Source of audio or video content, e.g. local disk arrays
    • H04N21/2187Live feed

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Databases & Information Systems (AREA)
  • Signal Processing (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The present disclosure provides a live broadcast interaction method, apparatus, storage medium and electronic device, and relates to the technical field of network live broadcast. The method comprises the following steps: acquiring map information of the live broadcast area where an anchor terminal is located; generating a position information interaction area on the graphical user interface according to the map information; and, in response to an operation event acting on the position information interaction area, providing a display window on the graphical user interface and displaying video data corresponding to the position information interaction area in the display window. With the method and the device, while watching the anchor's live broadcast picture, viewers can independently select a live broadcast viewing angle through the position information interaction area, thereby interacting with the anchor and improving the viewers' live-viewing experience.

Description

Live broadcast interaction method and device, storage medium and electronic equipment
Technical Field
The present disclosure relates to the field of network live broadcast technologies, and in particular, to a live broadcast interaction method, a live broadcast interaction apparatus, a computer-readable storage medium, and an electronic device.
Background
With the development of network technology, live broadcast technology has emerged in recent years; video live broadcast has taken on entertainment and social attributes, and watching live broadcasts has become a mainstream form of entertainment.
Existing live broadcast platforms provide users with an interface for real-time interaction with an anchor, which transmits the anchor's live scene to users in real time by capturing the current image data of a screen or camera. However, throughout the viewing process the user's viewing angle is limited to the picture shown on the anchor's screen; interaction with the anchor is lacking, which reduces the user's sense of participation.
It is to be noted that the information disclosed in the above background section is only for enhancement of understanding of the background of the present disclosure, and thus may include information that does not constitute prior art known to those of ordinary skill in the art.
Disclosure of Invention
The present disclosure is directed to a live broadcast interaction method, a live broadcast interaction apparatus, a computer-readable storage medium, and an electronic device, so as to solve the problem in the prior art that the live broadcast picture is limited to the anchor's single live viewing angle when a user watches a live broadcast.
According to a first aspect of the present disclosure, there is provided a live interaction method for providing a graphical user interface through a viewer terminal, the method comprising:
acquiring map information of a live broadcast area where a main broadcast terminal is located;
generating a position information interaction area on the graphical user interface according to the map information;
and responding to an operation event acting on the position information interaction area, providing a display window on the graphical user interface, and displaying video data corresponding to the position information interaction area in the display window.
In an exemplary embodiment of the present disclosure, the live zone includes at least one live sub-zone; the responding to the operation event acting on the position information interaction area, providing a display window on the graphical user interface, and displaying the video data corresponding to the position information interaction area in the display window, wherein the method comprises the following steps:
acquiring video data shot by a camera device arranged in each live broadcast subarea in real time;
and responding to an operation event acting on a first interaction subarea in the position information interaction area, providing a display window on the graphical user interface, and displaying video data shot by a camera device in a live broadcasting subarea corresponding to the first interaction subarea in the display window.
In an exemplary embodiment of the present disclosure, the map information of the live broadcast area includes a two-dimensional plan view of the live broadcast area, and the two-dimensional plan view includes the information of each live broadcast sub-area;
generating a location information interaction area on the graphical user interface according to the map information, comprising:
mapping each live broadcast subarea information in the two-dimensional plane diagram on the graphical user interface to generate the position information interaction area, wherein the position information interaction area comprises interaction subareas which are in one-to-one correspondence with the live broadcast subareas.
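The one-to-one mapping above can be sketched as a coordinate transform from plan units to GUI pixels. This is a minimal illustrative sketch, not the patent's implementation; all names (`plan_rects`, `viewport`, room ids) are assumptions.

```python
# Hypothetical sketch: map each live sub-area rectangle from the 2D plan
# (plan coordinates) into the GUI's location-information interaction area.
def map_plan_to_gui(plan_rects, plan_size, viewport):
    """plan_rects: {sub_area_id: (x, y, w, h)} in plan units.
    plan_size: (plan_w, plan_h); viewport: (vx, vy, vw, vh) on the GUI."""
    plan_w, plan_h = plan_size
    vx, vy, vw, vh = viewport
    sx, sy = vw / plan_w, vh / plan_h          # per-axis scale factors
    gui_rects = {}
    for sub_id, (x, y, w, h) in plan_rects.items():
        # one interaction sub-region per live sub-area (one-to-one mapping)
        gui_rects[sub_id] = (vx + x * sx, vy + y * sy, w * sx, h * sy)
    return gui_rects

rooms = {"room_a": (0, 0, 5, 4), "room_b": (5, 0, 5, 4)}
regions = map_plan_to_gui(rooms, (10, 4), (0, 0, 200, 80))
print(regions["room_b"])   # (100.0, 0.0, 100.0, 80.0)
```

A real client would additionally attach hit-testing and rendering to each returned rectangle.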
In an exemplary embodiment of the present disclosure, the map information of the live broadcast area includes image information acquired by a camera device provided in each live broadcast sub-area; generating a location information interaction area on the graphical user interface according to the map information, further comprising:
generating the interaction subarea according to the image information;
and splicing the interactive subregions to generate the position information interactive region.
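Generating sub-regions from camera image information and splicing them could look like the following sketch, where images are stand-in row-lists rather than real frames; the grid layout and all identifiers are illustrative assumptions, not the patented method itself.

```python
# Illustrative sketch: build one interaction sub-region per camera image,
# then splice the sub-regions into a single interaction region (here, a grid).
def make_subregion(camera_image, sub_id):
    return {"id": sub_id, "thumb": camera_image}

def splice_regions(subregions, columns=2):
    """Arrange sub-regions row by row; returns rows of sub-region ids."""
    grid = []
    for i in range(0, len(subregions), columns):
        grid.append([r["id"] for r in subregions[i:i + columns]])
    return grid

cams = {"room_a": [[1, 1]], "room_b": [[2, 2]], "room_c": [[3, 3]]}
regions = [make_subregion(img, rid) for rid, img in sorted(cams.items())]
layout = splice_regions(regions, columns=2)
print(layout)   # [['room_a', 'room_b'], ['room_c']]
```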
In an exemplary embodiment of the present disclosure, the displaying, in the display window, video data shot by a camera in a live broadcast sub-area corresponding to the first interactive sub-area includes:
determining a target display area of the display window on the graphical user interface according to a relative position relationship between a position identifier of an anchor terminal and the first interaction sub-area;
and displaying video data shot by a camera in a live broadcast subarea corresponding to the first interactive subarea in the target display area through the display window.
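One plausible reading of "determining a target display area from the relative position relationship" is placing the pop-up window on the side of the tapped sub-region facing away from the anchor's marker, so the marker is not covered. The heuristic and all coordinates below are assumptions for illustration only.

```python
def choose_display_area(anchor_pos, subregion_rect, screen_w):
    """Place the display window beside the tapped sub-region, on the side
    opposite the anchor's position identifier, keeping the marker visible."""
    ax, _ = anchor_pos
    rx, ry, rw, rh = subregion_rect
    region_cx = rx + rw / 2
    if ax < region_cx:                      # marker left of the sub-region
        side = "right"
        wx = min(rx + rw, screen_w - 1)     # clamp to the screen
    else:                                   # marker right of (or on) center
        side = "left"
        wx = max(rx - 1, 0)
    return side, (wx, ry)

side, origin = choose_display_area((10, 10), (40, 20, 30, 30), 200)
print(side, origin)   # right (70, 20)
```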
In an exemplary embodiment of the present disclosure, the method further comprises:
and responding to an operation event acting on a first interaction subarea in the position information interaction area, and displaying the first interaction subarea in the position information interaction area in an identification mode.
In an exemplary embodiment of the present disclosure, the method further comprises:
and when an operation event acting on a second interaction sub-region in the position information interaction region is detected, replacing the video data corresponding to the first interaction sub-region in the display window with the video data corresponding to the second interaction sub-region.
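The stream-swapping behavior of this embodiment can be modeled with a small state holder: tapping a second sub-region replaces the stream already shown in the window. The `DisplayWindow` class and the stream URLs are hypothetical stand-ins for the real client UI and media layer.

```python
class DisplayWindow:
    """Minimal sketch of the pop-up display window: tapping another
    interaction sub-region replaces the video stream shown in place."""

    def __init__(self):
        self.current_subregion = None

    def on_subregion_tap(self, sub_id, streams):
        # first tap opens the window; later taps swap the stream in place
        self.current_subregion = sub_id
        return streams[sub_id]

streams = {"room_a": "rtmp://cam-a", "room_b": "rtmp://cam-b"}
win = DisplayWindow()
win.on_subregion_tap("room_a", streams)          # first sub-region
src = win.on_subregion_tap("room_b", streams)    # second sub-region replaces it
print(src)   # rtmp://cam-b
```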
In an exemplary embodiment of the present disclosure, the method further comprises:
and acquiring the current position information of the anchor terminal, and displaying the position identifier of the anchor terminal in the position information interaction area.
In an exemplary embodiment of the present disclosure, the map information of the live broadcast area further includes three-dimensional point cloud data of the live broadcast area; the acquiring the current position information of the anchor terminal includes:
acquiring a current shooting picture of the anchor terminal;
and matching the current shooting picture with the three-dimensional point cloud data of the live broadcast area so as to determine the current position information of the anchor terminal in the live broadcast area.
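Matching a camera frame against a 3D point cloud normally involves feature descriptors (e.g. SIFT/ORB) and pose estimation; the sketch below compresses that to a nearest-descriptor vote per sub-area, purely to illustrate the matching idea. The descriptors, areas, and voting scheme are simplified assumptions, not the patent's algorithm.

```python
# Greatly simplified localization sketch: each "cloud point" carries a
# feature descriptor plus the sub-area it belongs to; the frame's
# descriptors vote for the sub-area owning their nearest cloud points.
from collections import Counter

def locate(frame_descriptors, cloud):
    votes = Counter()
    for fd in frame_descriptors:
        # nearest cloud point by squared descriptor distance
        best = min(cloud,
                   key=lambda p: sum((a - b) ** 2 for a, b in zip(fd, p["desc"])))
        votes[best["area"]] += 1
    return votes.most_common(1)[0][0]

cloud = [
    {"desc": (0.0, 0.0), "area": "room_a"},
    {"desc": (1.0, 1.0), "area": "room_b"},
]
frame = [(0.1, 0.0), (0.9, 1.1), (1.0, 0.9)]
print(locate(frame, cloud))   # room_b
```

A production system would instead solve a perspective-n-point problem to recover a full 6-DoF pose, not just the containing sub-area.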
In an exemplary embodiment of the present disclosure, the method further comprises:
when an operation event acting on a third interaction subarea where the anchor terminal is located in the position information interaction area is detected, determining a target live broadcasting subarea corresponding to the third interaction subarea;
and updating the position information interaction area into a target information interaction area according to the target live broadcast subarea.
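Updating the interaction area into a "target information interaction area" can be read as zooming into the finer plan of the sub-area the anchor occupies. The function below is a hedged sketch under that reading; the nested-plan data shape is an assumption.

```python
def update_interaction_area(current_area, tapped_sub_id, anchor_sub_id, sub_plans):
    """If the tapped sub-region is the one containing the anchor terminal,
    replace the interaction area with that sub-area's own (finer) plan;
    otherwise keep the current interaction area unchanged."""
    if tapped_sub_id == anchor_sub_id and tapped_sub_id in sub_plans:
        return sub_plans[tapped_sub_id]   # the target information interaction area
    return current_area

sub_plans = {"room_b": ["room_b.zone1", "room_b.zone2"]}
area = update_interaction_area(["room_a", "room_b"], "room_b", "room_b", sub_plans)
print(area)   # ['room_b.zone1', 'room_b.zone2']
```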
According to a second aspect of the present disclosure, there is provided a live broadcast interaction apparatus, including:
the map information acquisition module is used for acquiring the map information of the live broadcast area where the anchor terminal is located;
the interactive area generating module is used for generating a position information interactive area on the graphical user interface according to the map information;
and the video data display module is used for responding to an operation event acting on the position information interaction area, providing a display window on the graphical user interface and displaying the video data corresponding to the position information interaction area in the display window.
According to a third aspect of the present disclosure, there is provided a computer readable storage medium having stored thereon a computer program which, when executed by a processor, implements the method of any one of the above.
According to a fourth aspect of the present disclosure, there is provided an electronic device comprising: a processor; and a memory for storing executable instructions of the processor; wherein the processor is configured to perform the method of any one of the above via execution of the executable instructions.
Exemplary embodiments of the present disclosure may have some or all of the following benefits:
in the live broadcast interaction method provided by the embodiment of the disclosure, map information of a live broadcast area where a main broadcast terminal is located is obtained; generating a position information interaction area on the graphical user interface according to the map information; and responding to an operation event acting on the position information interaction area, providing a display window on the graphical user interface, and displaying video data corresponding to the position information interaction area in the display window. Spectators can independently select the live broadcast viewing angle through the position information interaction region while watching the live broadcast picture of the main broadcast, so that the interaction diversity of the live broadcast client is realized, and further the live broadcast viewing experience of spectators is improved.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the disclosure.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the present disclosure and together with the description, serve to explain the principles of the disclosure. It is to be understood that the drawings in the following description are merely exemplary of the disclosure, and that other drawings may be derived from those drawings by one of ordinary skill in the art without the exercise of inventive faculty.
Fig. 1 is a schematic diagram illustrating an exemplary system architecture to which a live interaction method and apparatus according to an embodiment of the present disclosure may be applied;
FIG. 2 illustrates a schematic structural diagram of a computer system suitable for use with the electronic device used to implement embodiments of the present disclosure;
fig. 3 schematically shows a flow diagram of a live interaction method according to an embodiment of the present disclosure;
FIG. 4 schematically illustrates a schematic diagram of a location information interaction area according to one embodiment of the present disclosure;
fig. 5 schematically shows a schematic diagram of a video data presentation according to an embodiment of the present disclosure;
fig. 6 schematically shows a flow chart of video data presentation according to an embodiment of the present disclosure;
figs. 7A and 7B schematically show video data presentation according to another embodiment of the present disclosure;
figs. 8A and 8B schematically show location information interaction area updates according to an embodiment of the present disclosure;
fig. 9 schematically shows a block diagram of a live interaction device according to an embodiment of the present disclosure.
Detailed Description
Example embodiments will now be described more fully with reference to the accompanying drawings. Example embodiments may, however, be embodied in many different forms and should not be construed as limited to the examples set forth herein; rather, these embodiments are provided so that this disclosure will be thorough and complete, and will fully convey the concept of example embodiments to those skilled in the art. The described features, structures, or characteristics may be combined in any suitable manner in one or more embodiments. In the following description, numerous specific details are provided to give a thorough understanding of embodiments of the disclosure. One skilled in the relevant art will recognize, however, that the embodiments of the present disclosure can be practiced without one or more of the specific details, or with other methods, components, devices, steps, etc. In other instances, well-known technical solutions have not been shown or described in detail to avoid obscuring aspects of the present disclosure.
Furthermore, the drawings are merely schematic illustrations of the present disclosure and are not necessarily drawn to scale. The same reference numerals in the drawings denote the same or similar parts, and thus their repetitive description will be omitted. Some of the block diagrams shown in the figures are functional entities and do not necessarily correspond to physically or logically separate entities. These functional entities may be implemented in the form of software, or in one or more hardware modules or integrated circuits, or in different networks and/or processor devices and/or microcontroller devices.
Fig. 1 is a schematic diagram illustrating a system architecture of an exemplary application environment to which a live interaction method and apparatus according to an embodiment of the present disclosure may be applied.
As shown in fig. 1, the live system architecture 100 may include an anchor terminal 101, a central control module 102, a live server 103, a viewer terminal 104, and one or more mobile camera devices 105. The anchor terminal 101 may be any of various electronic devices having a display screen, including but not limited to desktop computers, portable computers, smart phones, and tablet computers. In a live broadcast system, the anchor terminal 101 is generally a smart phone such as the anchor's phone, which may further be provided with a display screen and a fixed camera; the fixed camera may include a front camera and a rear camera. Taking a live broadcast in a secret room as an example, the central control module 102 may be a central control server that collects video data captured by all the mobile camera devices 105 in the secret room and controls the startup or reset of the secret-room devices. The live broadcast server 103 may be used for collecting, encoding, recording, and distributing live broadcast data. The viewer terminal 104 is likewise typically a smart phone, such as the phone of a user watching the live broadcast.
In the embodiment of the present disclosure, the captured picture of the anchor terminal 101 may be sent to the live broadcast server 103 through the network, and the live broadcast server 103 may forward the captured picture of the anchor terminal 101 to the viewer terminal 104 to view the anchor live broadcast. Meanwhile, the position information of the anchor terminal 101 can also be sent to the central control module 102 through the network, the central control module 102 and the audience terminal 104 have a data interface, and the position information of the anchor terminal 101 can be transmitted through the data interface, so that the audience terminal 104 can intuitively know the position of the anchor. In addition, the central control module 102 is connected to the mobile camera device 105, the central control module 102 can control the mobile camera device 105 to shoot, and each mobile camera device 105 is an independent device for shooting, it can be understood that video data shot by the mobile camera device 105 can be transmitted to the central control module 102, and the central control module 102 can transmit the video data to the audience terminal 104 through the data interface for the audience terminal 104 to watch. According to implementation requirements, the embodiment of the present disclosure may have any number of terminal devices and servers, for example, the server 103 may be a server cluster formed by a plurality of servers.
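The data flow above — cameras and the anchor's position feeding the central control module, which serves viewer terminals over a data interface — can be modeled with a toy aggregator. The class and method names are invented for illustration and do not come from the patent.

```python
class CentralControl:
    """Toy model of central control module 102: aggregates the mobile
    cameras' feeds and the anchor terminal's position, and serves both
    to viewer terminals through a data interface."""

    def __init__(self):
        self.feeds = {}
        self.anchor_position = None

    def register_camera(self, sub_area, feed):
        self.feeds[sub_area] = feed           # video from mobile camera 105

    def update_anchor_position(self, pos):
        self.anchor_position = pos            # reported by anchor terminal 101

    def serve_viewer(self, sub_area):
        # what viewer terminal 104 receives for a selected sub-area
        return {"video": self.feeds.get(sub_area),
                "anchor_at": self.anchor_position}

cc = CentralControl()
cc.register_camera("room_a", "feed-a")
cc.update_anchor_position("room_a")
print(cc.serve_viewer("room_a"))   # {'video': 'feed-a', 'anchor_at': 'room_a'}
```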
The live interactive method provided by the embodiment of the present disclosure is generally executed by the audience terminal 104, and accordingly, the live interactive apparatus is generally disposed in the audience terminal 104. However, it is easily understood by those skilled in the art that the live broadcast interaction method provided in the embodiment of the present disclosure may also be executed by the anchor terminal 101, and may also be executed by the live broadcast server 103 and the central control module 102, and accordingly, the live broadcast interaction apparatus may also be configured to be executed by the anchor terminal 101, and may also be configured to be configured in the live broadcast server 103 and the central control module 102, which is not particularly limited in this exemplary embodiment.
FIG. 2 illustrates a schematic structural diagram of a computer system suitable for use in implementing the electronic device of an embodiment of the present disclosure.
It should be noted that the computer system 200 of the electronic device shown in fig. 2 is only an example, and should not bring any limitation to the functions and the scope of the application of the embodiments of the present disclosure.
As shown in fig. 2, the computer system 200 includes a Central Processing Unit (CPU) 201 that can perform various appropriate actions and processes in accordance with a program stored in a Read-Only Memory (ROM) 202 or a program loaded from a storage section 208 into a Random Access Memory (RAM) 203. In the RAM 203, various programs and data necessary for system operation are also stored. The CPU 201, ROM 202, and RAM 203 are connected to each other via a bus 204. An input/output (I/O) interface 205 is also connected to the bus 204.
The following components are connected to the I/O interface 205: an input portion 206 including a keyboard, a mouse, and the like; an output section 207 including a display such as a Cathode Ray Tube (CRT), a Liquid Crystal Display (LCD), and the like, and a speaker; a storage section 208 including a hard disk and the like; and a communication section 209 including a network interface card such as a LAN card, a modem, or the like. The communication section 209 performs communication processing via a network such as the internet. A drive 210 is also connected to the I/O interface 205 as needed. A removable medium 211 such as a magnetic disk, an optical disk, a magneto-optical disk, a semiconductor memory, or the like is mounted on the drive 210 as necessary, so that a computer program read out therefrom is mounted into the storage section 208 as necessary.
In particular, the processes described below with reference to the flow diagrams may be implemented as computer software programs, according to embodiments of the present disclosure. For example, embodiments of the present disclosure include a computer program product comprising a computer program embodied on a computer readable medium, the computer program comprising program code for performing the method illustrated in the flow chart. In such an embodiment, the computer program may be downloaded and installed from a network through the communication section 209 and/or installed from the removable medium 211. The computer program, when executed by the Central Processing Unit (CPU) 201, performs various functions defined in the methods and apparatus of the present application.
As another aspect, the present application also provides a computer-readable medium, which may be contained in the electronic device described in the above embodiments; or may exist separately without being assembled into the electronic device. The computer readable medium carries one or more programs which, when executed by one of the electronic devices, cause the electronic device to implement the method as described in the embodiments below. For example, the electronic device may implement the steps shown in fig. 3 and fig. 6, and so on.
It should be noted that the computer readable media shown in the present disclosure may be computer readable signal media or computer readable storage media or any combination of the two. A computer readable storage medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any combination of the foregoing. More specific examples of the computer readable storage medium may include, but are not limited to: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. In the present disclosure, a computer readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device. In contrast, in the present disclosure, a computer-readable signal medium may include a propagated data signal with computer-readable program code embodied therein, for example, in baseband or as part of a carrier wave. Such a propagated data signal may take any of a variety of forms, including, but not limited to, electro-magnetic, optical, or any suitable combination thereof. A computer readable signal medium may also be any computer readable medium that is not a computer readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device. Program code embodied on a computer readable medium may be transmitted using any appropriate medium, including but not limited to: wireless, wire, fiber optic cable, RF, etc., or any suitable combination of the foregoing.
The technical solution of the embodiment of the present disclosure is explained in detail below:
currently, a live broadcast platform can provide an interface for a user to interact with the anchor in real time, which transmits the anchor's live scene to the user in real time by capturing the current image data of a screen or camera. However, throughout the viewing process the user's viewing angle is limited to the picture shown on the anchor's screen; interaction with the anchor is lacking, which reduces the user's sense of participation. For example, for live broadcasting in a secret room, a user can only watch the anchor's mobile-phone live broadcast, with no interaction and no sense of participation.
Based on one or more of the above problems, the present exemplary embodiment provides a live broadcast interaction method, which may be applied to the viewer terminal 104, the anchor terminal 101, the live broadcast server 103 and the central control module 102, and this is not particularly limited in this exemplary embodiment. Referring to fig. 3, the live interaction method may include the following steps S310 to S330:
s310, obtaining map information of a live broadcast area where a main broadcast terminal is located;
s320, generating a position information interaction area on the graphical user interface according to the map information;
step S330, responding to an operation event acting on the position information interaction area, providing a display window on the graphical user interface, and displaying video data corresponding to the position information interaction area in the display window.
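Steps S310 to S330 can be sketched end to end on the viewer side as below. The `ViewerClient` stub and its data shapes stand in for the real networking and UI layers and are purely illustrative assumptions.

```python
class ViewerClient:
    """Hypothetical viewer-terminal stub walking steps S310-S330."""

    def __init__(self, map_info, videos):
        self._map, self._videos, self.window = map_info, videos, None

    def get_map_info(self):                      # S310: fetch live-area map
        return self._map

    def build_interaction_area(self, m):         # S320: one sub-region per sub-area
        return list(m["sub_areas"])

    def on_tap(self, sub_id):                    # S330: open window with its video
        self.window = self._videos[sub_id]

c = ViewerClient({"sub_areas": ["room_a", "room_b"]},
                 {"room_a": "v-a", "room_b": "v-b"})
area = c.build_interaction_area(c.get_map_info())
c.on_tap(area[1])                                # viewer taps the second sub-region
print(c.window)   # v-b
```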
In the live broadcast interaction method provided by the embodiment of the disclosure, map information of a live broadcast area where a main broadcast terminal is located is obtained; generating a position information interaction area on the graphical user interface according to the map information; and responding to an operation event acting on the position information interaction area, providing a display window on the graphical user interface, and displaying video data corresponding to the position information interaction area in the display window. Spectators can independently select the live broadcast viewing angle through the position information interaction region while watching the live broadcast picture of the main broadcast, so that the interaction diversity of the live broadcast client is realized, and further the live broadcast viewing experience of spectators is improved.
The above steps of the present exemplary embodiment will be described in more detail below.
In step S310, map information of a live zone where the anchor terminal is located is acquired.
The anchor client can be an anchor live client, the client can be provided with a live application program, when the live application program is running, the live server can acquire current live data captured by the anchor client, namely when the live application program is running, the anchor client can capture a live picture displayed on a current screen and transmit the live picture to an audience client for watching the live picture through the live server. The viewer client may also have a live application installed for viewing live pictures transmitted in real time by the live server.
In an exemplary embodiment, a live broadcast in a secret room may be taken as an example. Secret-room live content differs from game live broadcasts, gourmet live broadcasts, and the like: it refers to a live broadcast mode in which, inside an enclosed space (one room or a combination of several rooms), the anchor takes part in various forms of game play while the whole process is broadcast live to attract viewers. For example, in a secret-room live broadcast the anchor may participate in various decryption games, continuously discovering clues and hints through observation and logical thinking; if the anchor succeeds within the specified time, he or she may leave the secret room. The whole live broadcast is full of the unknown and of uncertainty, and users can decrypt together with the anchor by watching, which adds to the fun of viewing the live broadcast.
In this example, a graphical user interface may be provided by the viewer terminal, and a location information interaction area may be displayed on the graphical user interface of the viewer terminal while the live application of the viewer client is running. Before displaying the position information interaction area, map information of a live broadcast area where the anchor terminal is located may be acquired. The map information of the live broadcast area can include a two-dimensional plane map of the live broadcast area, image information collected by a camera device arranged in the live broadcast area and three-dimensional point cloud data of the live broadcast area.
In one example, a two-dimensional plan of the secret room may be obtained using visual SLAM (Simultaneous Localization And Mapping). SLAM refers to a subject carrying a specific sensor (such as a camera) building a map or model of its environment during movement, without prior information about that environment, while simultaneously estimating its own position. In other examples, the two-dimensional plan of the secret room may also be obtained with measuring devices such as three-dimensional laser scanners or laser ranging sensors.
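The map produced by SLAM can be collapsed into a rough floor plan by projecting its landmark points top-down. The sketch below illustrates that idea only; it assumes the map is available as a list of (x, y, z) landmark coordinates, and the grid resolution and sample data are invented, not part of the patent.

```python
def points_to_floorplan(points, cell_size=0.5):
    """Project 3D landmark points top-down onto a coarse 2D occupancy grid."""
    occupied = set()
    for x, y, _z in points:  # drop the height coordinate
        occupied.add((int(x // cell_size), int(y // cell_size)))
    return occupied

# Hypothetical landmarks along two walls of a room
landmarks = [(0.1, 0.0, 1.2), (0.6, 0.0, 0.8), (0.0, 0.7, 1.5)]
print(sorted(points_to_floorplan(landmarks)))  # [(0, 0), (0, 1), (1, 0)]
```

A real pipeline would additionally filter noise and trace wall contours from the occupied cells before drawing the plan.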
The live broadcast area where the anchor terminal is located may include at least one live broadcast sub-area, and the anchor may broadcast from any of the live broadcast sub-areas. For example, in a secret room live broadcast, the live broadcast area may be the secret room and the live broadcast sub-areas may be the several small rooms inside it. The number of live broadcast sub-areas may equal the number of small rooms, or be smaller than it; that is, some small rooms may not be broadcast at all. The two-dimensional plan of the live broadcast area may include information on each live broadcast sub-area, so that the distribution of the small rooms within the secret room can be shown to viewers intuitively through the two-dimensional plan.
In one example, image information collected by camera devices arranged in the live broadcast area may be acquired. For example, a plurality of cameras may be installed in the live broadcast area, with at least one camera in each live broadcast sub-area. In this example, the secret room live broadcast may include a secret room central control system, which collects the video data captured by the cameras in each live broadcast sub-area and can also control the activation and reset of the devices in the secret room, for example the activation and reset of each camera. The image information collected by the camera devices arranged in the live broadcast area can therefore be acquired in real time through the secret room central control system.
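As a rough sketch of the bookkeeping role the central control system plays, the registry below maps each live broadcast sub-area to its cameras and gates video delivery on whether a camera has been started. All class and method names are invented for illustration; the patent does not specify this interface.

```python
class CentralControl:
    """Illustrative registry of sub-area cameras (names are assumptions)."""
    def __init__(self):
        self.cameras = {}    # sub_area -> list of camera ids
        self.running = set()

    def register(self, sub_area, camera_id):
        self.cameras.setdefault(sub_area, []).append(camera_id)

    def start_all(self):
        for cams in self.cameras.values():
            self.running.update(cams)

    def feeds_for(self, sub_area):
        # only started cameras can deliver video to viewer terminals
        return [c for c in self.cameras.get(sub_area, []) if c in self.running]

cc = CentralControl()
cc.register("room_a", "cam_1")
cc.register("room_a", "cam_2")
cc.start_all()
print(cc.feeds_for("room_a"))  # ['cam_1', 'cam_2']
```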
In one example, three-dimensional point cloud data of the secret room may be acquired. For example, three-dimensional coordinates and imagery of the secret room may be captured with an optical ranging system, which may include stereo cameras, time-of-flight infrared cameras, and structured light devices, or with intensity-based scanning techniques.
In step S320, a location information interaction area is generated on the graphical user interface according to the map information.
In this example, the position information interaction area may be a map User Interface (UI) of the secret room where the anchor broadcasts. A map UI provides an accessible and effective way of visualizing a geographic space, and it may further encode the interaction relationship between the user and the terminal interface.
In one example, after the two-dimensional plan of the secret room is obtained, the secret room map UI may be designed from it; for example, the information of each live broadcast sub-area in the two-dimensional plan may be mapped onto the graphical user interface of the viewer terminal to generate the position information interaction area. Specifically, a map model can be drawn to scale from the contour of each live broadcast sub-area in the two-dimensional plan, and when this map model is displayed on the graphical user interface, the secret room map UI is generated. The position information interaction area may include at least one interaction sub-area, with the interaction sub-areas corresponding one-to-one to the live broadcast sub-areas.
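The to-scale mapping of room contours onto the viewer's screen can be sketched as a single uniform scale factor applied to all polygons, as below. The polygon coordinates and UI dimensions are made up for illustration.

```python
def contours_to_ui(rooms, ui_w, ui_h):
    """Scale room polygons (in metres) into UI pixels with one uniform factor,
    so every sub-area keeps its relative size and position."""
    xs = [x for poly in rooms.values() for x, _ in poly]
    ys = [y for poly in rooms.values() for _, y in poly]
    scale = min(ui_w / (max(xs) - min(xs)), ui_h / (max(ys) - min(ys)))
    ox, oy = min(xs), min(ys)
    return {name: [((x - ox) * scale, (y - oy) * scale) for x, y in poly]
            for name, poly in rooms.items()}

plan = {"A": [(0, 0), (2, 0), (2, 2), (0, 2)],
        "B": [(2, 0), (4, 0), (4, 2), (2, 2)]}
ui = contours_to_ui(plan, ui_w=200, ui_h=100)
print(ui["A"][2])  # (100.0, 100.0) -- room A's far corner in UI pixels
```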
In another example, after the image information collected by the camera device in each live broadcast sub-area is acquired, the interaction sub-areas may be generated from that image information. In this example, the camera device may be a panoramic camera, such as a 720-degree panoramic camera. The panoramic pictures shot by the panoramic camera in each live broadcast sub-area can be acquired in real time and processed one by one: a picture region may be extracted from each panorama at a specified viewing angle, the extracted region is projected top-down, and a two-dimensional planar projection image of each live broadcast sub-area is generated. Mapping the projection image of each live broadcast sub-area onto the graphical user interface yields a plurality of interaction sub-areas. Finally, the interaction sub-areas corresponding to the panoramic cameras are stitched together in order, according to the position distribution of the cameras within the live broadcast area, to generate the position information interaction area.
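The stitching step, placing each sub-area's top-down projection according to its camera's position in the room layout, reduces to an offset computation. The sketch below assumes cameras are arranged on a grid and uses invented ids and tile sizes.

```python
def stitch_layout(camera_grid, tile_w=100, tile_h=80):
    """camera_grid: {camera_id: (row, col)} -> pixel offset of each
    sub-area tile inside the composed position information interaction area."""
    return {cam: (col * tile_w, row * tile_h)
            for cam, (row, col) in camera_grid.items()}

offsets = stitch_layout({"cam_a": (0, 0), "cam_b": (0, 1), "cam_c": (1, 0)})
print(offsets["cam_b"], offsets["cam_c"])  # (100, 0) (0, 80)
```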
Referring to fig. 4, when the live application of the viewer client is running, a secret room map UI 420 may be displayed in the live interface 410 shown on the viewer terminal. The secret room map UI 420 may be placed at any position in the live interface 410, such as its lower right corner or its upper left corner.
The secret room map UI 420 may include at least one interaction sub-area, corresponding to the live broadcast sub-areas in the live broadcast area. In an example embodiment, each interaction sub-area may be configured as a click hot zone, which functions like a button: the click hot zone is the region in which a click on that button is valid, and each click hot zone may represent one small room in which a camera device is installed. As shown in fig. 4, six click hot zones are displayed in the secret room map UI 420, which intuitively indicates that the secret room where the anchor is located contains six sub-areas, for example six small rooms. The shape of each click hot zone may match the shape of the corresponding small room in the two-dimensional plan of the secret room. In other examples, a click hot zone may take any shape, such as a circle or another shape containing the center of the corresponding small room in the plan, which is not specifically limited in this example. It is noted that the secret room map UI 420 may be moved around the live interface 410 in response to a viewer action event; for example, it may be dragged to follow the viewer's finger, as suggested by the hand identifier in fig. 4.
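A click hot zone is in effect a hit test of the click position against the room's outline. A standard ray-casting point-in-polygon check, sketched below with made-up polygon data, is one way such a hot zone could be implemented; the patent does not prescribe any particular algorithm.

```python
def point_in_polygon(pt, poly):
    """Ray casting: count edge crossings of a ray going right from pt."""
    x, y = pt
    inside = False
    for i in range(len(poly)):
        x1, y1 = poly[i]
        x2, y2 = poly[(i + 1) % len(poly)]
        if (y1 > y) != (y2 > y):  # edge straddles the ray's height
            x_cross = x1 + (y - y1) * (x2 - x1) / (y2 - y1)
            if x < x_cross:
                inside = not inside
    return inside

def hit_test(click, hot_zones):
    """Return the first hot zone whose polygon contains the click, if any."""
    for name, poly in hot_zones.items():
        if point_in_polygon(click, poly):
            return name
    return None

zones = {"room_a": [(0, 0), (10, 0), (10, 10), (0, 10)]}
print(hit_test((5, 5), zones), hit_test((15, 5), zones))  # room_a None
```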
In an example embodiment, the position information sent by the anchor terminal may also be acquired, and a position identifier of the anchor terminal may be displayed in the position information interaction area. The anchor terminal may be the anchor's mobile phone, which can be used both for normal live broadcasting and as a positioning anchor point: the position information (such as coordinates) of the phone inside the secret room can be sent to the secret room central control system, which can forward this real-time position information to the viewer terminals through a data interface. Correspondingly, an identifier (such as the circular identifier in fig. 4) may be defined in the secret room map UI 420 to represent the anchor and to move along with the anchor inside the secret room. The identifier indicates the anchor's corresponding location in the room, so that viewers can follow the anchor's current position in real time.
Illustratively, the position information of the anchor terminal can be obtained through visual positioning, which refers to computing the position and pose of a camera at the moment a certain two-dimensional image was shot, based on a pre-constructed 3D model. A matching relationship between 3D points and 2D points must be established so that the camera pose can be estimated from it. For example, video stream data of the secret room, including color images, depth images, and pose parameters, may be acquired; an indoor three-dimensional map and an indoor two-dimensional map may be constructed from this video stream; and the coordinate systems of the two maps may be aligned to build a network data set. The network data set includes the color images, the depth images, the pose parameters, the indoor three-dimensional map, and the indoor two-dimensional map. Specifically, the current shooting picture of the anchor terminal can be acquired and, based on the pre-constructed network data set, matched against the three-dimensional point cloud data of the live broadcast area to determine the coordinate positions of the picture's 2D key points within the point cloud, and thereby the current position of the anchor terminal in the live broadcast area.
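The 2D–3D matching step can be caricatured as follows: each keypoint descriptor from the current frame is matched to its nearest descriptor in the point cloud, and the anchor's position is then estimated from the matched 3D points. A real system would run a proper feature matcher followed by PnP pose estimation; the centroid step below is a deliberately crude stand-in, and all descriptors and coordinates are invented.

```python
def match_keypoints(frame_descs, cloud):
    """cloud: list of (descriptor, (x, y, z)).
    Nearest-descriptor matching by squared Euclidean distance."""
    matches = []
    for d in frame_descs:
        _desc, pt = min(cloud, key=lambda c: sum((a - b) ** 2
                                                 for a, b in zip(d, c[0])))
        matches.append(pt)
    return matches

def estimate_position(matched_points):
    # crude stand-in for full PnP: centroid of the matched 3D points
    n = len(matched_points)
    return tuple(round(sum(p[i] for p in matched_points) / n, 3)
                 for i in range(3))

cloud = [((1.0, 0.0), (0.0, 0.0, 0.0)), ((0.0, 1.0), (2.0, 0.0, 0.0))]
frame = [(0.9, 0.1), (0.1, 0.9)]
print(estimate_position(match_keypoints(frame, cloud)))  # (1.0, 0.0, 0.0)
```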
In step S330, in response to an operation event acting on the position information interaction area, a display window is provided on the graphical user interface, and video data corresponding to the position information interaction area is displayed in the display window.
In an example embodiment, the secret room central control system can start the camera devices of each live broadcast sub-area, and once started, each camera device can shoot its corresponding sub-area independently. The central control system is connected with every camera device in the secret room and can acquire the video data of each live broadcast sub-area in real time; according to the viewers' operation events on the position information interaction area, it can send the corresponding video data to the viewer terminals through a data interface.
For example, when the viewer terminal device is a smartphone, the operation event may be a tap or a press: the viewer may touch, tap, or press the position information interaction area to interact. When the viewer terminal device is a PC, the operation event may be a click, for example a mouse click on the position information interaction area. When a viewer clicks a first interaction sub-area in the secret room map UI, a display window can be generated on the graphical user interface of the viewer terminal in response to the click, and the video data shot by the camera device in the live broadcast sub-area corresponding to the first interaction sub-area is displayed in that window. In one example, in response to the click on the first interaction sub-area, that sub-area may also be displayed in an identifiable manner in the secret room map UI, such as highlighted, with a bolded outline, or in another form.
Referring to fig. 5, when the viewer clicks any click hot zone in the secret room map UI 420, that hot zone may be displayed distinctly from the other hot zones, for example by highlighting it. At the same time, a display window 430 is generated on the live interface 410, through which the video data of the room corresponding to the clicked hot zone is displayed. The display window 430 may, for example, cover one third of the live interface, or the viewer may manually resize it according to actual needs, which is not limited in this example. This makes it convenient for the viewer to operate the display window and watch video from any part of the live broadcast area where the anchor is located, improving the usability of the viewer terminal.
In this method, a visual interactive map interface (the secret room map UI) is presented to viewers, so that they can quickly grasp the layout of the secret room and follow the anchor's current position within it in real time. In addition, by clicking the interaction sub-areas in the secret room map UI, viewers can interactively watch different areas of the secret room, which increases their sense of participation in the live broadcast.
As shown in fig. 6, video data presentation may also be performed according to steps S610 and S620.
In step S610, a target display area of the display window on the graphical user interface is determined according to a relative positional relationship between the position identifier of the anchor terminal and the first interaction sub-area.
In response to a viewer's operation event on the first interaction sub-area of the position information interaction area, the target display area of the display window on the client interface is determined according to the relative positional relationship between the position identifier of the anchor terminal in the position information interaction area and the first interaction sub-area, that is, whether each room lies above, below, to the left of, or to the right of the anchor.
In step S620, the video data captured by the camera in the live broadcast sub-area corresponding to the first interactive sub-area is displayed in the target display area through the display window.
Illustratively, suppose the room corresponding to the first interaction sub-area is room A, and the secret room map UI shows that the anchor is currently at the upper left of room A, i.e. room A lies at the lower right of the anchor's current position. The current target display area of the display window is then the lower right region of the live interface, and the display window plays the real-time video picture of room A, captured in real time by the camera device inside it. In other examples, the display window may instead be shown at a fixed position of the live interface, for example fixed at its lower right corner, to keep the window from blocking the live picture and to optimize the user experience.
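The corner selection just described reduces to a quadrant test between the anchor's map position and the clicked room's centre. A minimal sketch, assuming map coordinates where, as on most screens, y grows downward:

```python
def target_display_corner(anchor_pos, room_center):
    """Place the window in the screen corner matching the clicked room's
    direction from the anchor (screen y axis assumed to grow downward)."""
    ax, ay = anchor_pos
    rx, ry = room_center
    vert = "lower" if ry > ay else "upper"
    horiz = "right" if rx > ax else "left"
    return f"{vert}-{horiz}"

# Anchor at the upper left of room A -> window goes to the lower right
print(target_display_corner((1, 1), (3, 3)))  # lower-right
```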
In an example embodiment, taking a click as the operation event: first, it may be detected whether a click is received at any position in the secret room map UI; when a click is detected, its position range may be determined by a position sensor, so as to decide whether the click falls on any interaction sub-area. When an operation event acting on a second interaction sub-area is detected, the video data corresponding to the first interaction sub-area in the display window can be replaced by the video data corresponding to the second interaction sub-area. In other examples, a plurality of display windows may be generated on the graphical user interface of the viewer terminal in response to a plurality of operation events on the position information interaction area, which is not specifically limited in this example. For example, when the viewer clicks two interaction sub-areas in turn, two display windows may be generated in turn, each displaying the video data of its interaction sub-area.
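The replace-on-second-click behaviour amounts to a small piece of UI state: which hot zone is highlighted and which sub-areas' feeds are currently shown. A minimal sketch, with all names invented for illustration:

```python
class MapUIState:
    """Tracks the highlighted hot zone and the open display windows."""
    def __init__(self, multi_window=False):
        self.multi_window = multi_window
        self.highlighted = None
        self.windows = []  # sub-areas whose video is currently displayed

    def click(self, zone):
        self.highlighted = zone  # the previous highlight is cancelled
        if self.multi_window:
            self.windows.append(zone)   # one extra window per click
        else:
            self.windows = [zone]       # replace the displayed video data
        return self.windows

ui = MapUIState()
ui.click("room_a")
print(ui.click("room_b"), ui.highlighted)  # ['room_b'] room_b
```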
Referring to fig. 7, in the secret room map UI 420 shown in fig. 7A, the highlighted circular identifier may represent the anchor's current position and the highlighted square identifier may represent a first interaction sub-area, such as a first click hot zone, corresponding to small room A in the secret room. Since room A lies at the lower right of the anchor's current position, the display window 430 may be shown at the lower right corner of the live interface 410, displaying video captured in real time by the camera device in room A. In the secret room map UI 420 shown in fig. 7B, the anchor's position has not changed; when the viewer clicks a second interaction sub-area, such as a second click hot zone, that hot zone may be highlighted and the highlighting of the first click hot zone cancelled. The second click hot zone corresponds to small room B, so the display window 430 may be shown at the upper left corner of the live interface 410 according to the relative position of room B and the anchor, displaying video captured in real time by the camera device in room B. In this example, the video of each room is displayed at the position of the live interface that matches the room's position relative to the anchor, giving the viewer a coherent visual effect as the secret room scene changes.
With the live interaction method described above, the secret room central control system can exchange the anchor's position information and the secret room map information with the live client in real time. Viewers can intuitively see the current state of the secret room and the anchor's position within it, and through their interaction with the secret room map UI on the live interface, the video pictures shot by the corresponding camera devices are automatically shown on the live picture. At the same time, the anchor and the viewers can interact across the online and offline settings, improving the live broadcast effect.
In an example embodiment, when an operation event is detected on a third interaction sub-area, the one in which the anchor terminal is currently located, the target live broadcast sub-area corresponding to that third interaction sub-area may be determined. Similarly to the generation of the position information interaction area, the map information of the target live broadcast sub-area can be acquired and a target information interaction area generated from it; the position information interaction area on the live interface is then updated to this target information interaction area. The target information interaction area shows the layout of the target live broadcast sub-area: for example, it may contain the distribution of the zones inside the sub-area as well as item information, such as item identifiers, through which viewers can learn an item's position and interact with the anchor. Referring to fig. 8, in fig. 8A, when the anchor's position identifier is inside the interaction sub-area 810, the anchor is in the small room corresponding to that sub-area. Illustratively, when a viewer clicks the interaction sub-area 810, the information of the corresponding small room can be acquired and the secret room map UI can change, i.e. be updated to the map UI of that small room, which intuitively shows viewers the distribution of the zones inside the room as well as the position of each item in it.
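Updating the map UI when the viewer clicks the room the anchor is in is, in essence, a conditional map swap, sketched below with invented names and map values:

```python
def on_zone_click(zone, anchor_zone, sub_maps, current_map):
    """Drill into a room's own map UI only when the anchor is inside it;
    otherwise keep showing the overview map."""
    if zone == anchor_zone and zone in sub_maps:
        return sub_maps[zone]
    return current_map

maps = {"room_a": "map_of_room_a"}
print(on_zone_click("room_a", "room_a", maps, "overview"))  # map_of_room_a
print(on_zone_click("room_b", "room_a", maps, "overview"))  # overview
```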
In the live broadcast interaction method provided by the embodiments of the present disclosure, map information of the live broadcast area where the anchor terminal is located is acquired; a position information interaction area is generated on the graphical user interface according to the map information; and, in response to an operation event acting on the position information interaction area, a display window is provided on the graphical user interface, in which the video data corresponding to the position information interaction area is displayed. While watching the anchor's live picture, viewers can independently choose their live viewing angle through the position information interaction area, which diversifies the interaction of the live client and further improves the viewers' live-watching experience.
It should be noted that although the various steps of the methods of the present disclosure are depicted in the drawings in a particular order, this does not require or imply that these steps must be performed in this particular order, or that all of the depicted steps must be performed, to achieve desirable results. Additionally or alternatively, certain steps may be omitted, multiple steps combined into one step execution, and/or one step broken into multiple step executions, etc.
Further, in the present exemplary embodiment, a live broadcast interaction apparatus is also provided. The device can be applied to a terminal device or a server. Referring to fig. 9, the live broadcast interactive apparatus 900 may include a map information obtaining module 910, an interactive area generating module 920, and a video data presentation module 930, where:
a map information obtaining module 910, configured to obtain map information of a live broadcast area where the anchor terminal is located;
an interaction area generating module 920, configured to generate a location information interaction area on the graphical user interface according to the map information;
a video data display module 930, configured to provide a display window on the graphical user interface in response to an operation event acting on the position information interaction area, and display video data corresponding to the position information interaction area in the display window.
In an alternative embodiment, the live zone comprises at least one live sub-zone; the video data presentation module 930 includes:
the video data acquisition module is used for acquiring video data shot by the camera device in each live broadcast subregion in real time;
and the first data display module is used for responding to an operation event acting on a first interaction subarea in the position information interaction area, providing a display window on the graphical user interface, and displaying video data shot by a camera device in a live broadcasting subarea corresponding to the first interaction subarea in the display window.
In an optional implementation manner, the map information of the live broadcast area includes a two-dimensional plan view of the live broadcast area, where the two-dimensional plan view includes the information of each live broadcast sub-area; the interaction region generation module 920 is configured to map information of each live sub region in the two-dimensional plane map on the graphical user interface, and generate the location information interaction region, where the location information interaction region includes interaction sub regions corresponding to the live sub regions one to one.
In an optional implementation manner, the map information of the live broadcast area includes image information acquired by a camera device arranged in each live broadcast sub-area; the interaction region generation module 920 further includes:
the interactive subregion generating unit is used for generating the interactive subregion according to the image information;
and the position information interaction area generating unit is used for splicing the interaction sub-areas to generate the position information interaction area.
In an alternative embodiment, the first data presentation module comprises:
a display area determining unit, configured to determine, according to a relative positional relationship between the position identifier of the anchor terminal and the first interaction sub-area, a target display area of the display window on the graphical user interface;
and the first data display unit is used for displaying the video data shot by the camera device in the live broadcast subarea corresponding to the first interactive subarea in the target display area through the display window.
In an optional implementation manner, the live interactive device 900 further includes an interactive sub-region identification display module, configured to respond to an operation event that acts on a first interactive sub-region in the location information interactive region, and identify and display the first interactive sub-region in the location information interactive region.
In an optional implementation manner, the live broadcast interaction apparatus 900 further includes a display data switching module, configured to, when an operation event acting on a second interaction sub-region in the location information interaction region is detected, replace video data corresponding to the first interaction sub-region in the display window with video data corresponding to the second interaction sub-region.
In an alternative embodiment, the live interaction device 900 further comprises:
the position information acquisition module is used for acquiring the current position information of the anchor terminal;
and the position identifier display module is used for displaying the position identifier of the anchor terminal in the position information interaction area.
In an alternative embodiment, the location information obtaining module includes:
the picture acquisition unit is used for acquiring a current shooting picture of the anchor terminal;
and the data matching unit is used for matching the current shooting picture with the three-dimensional point cloud data of the live broadcast area so as to determine the current position information of the anchor terminal in the live broadcast area.
In an alternative embodiment, the live interaction device 900 further comprises:
a target live broadcast sub-region determining module, configured to, when an operation event acting on a third interaction sub-region in which the anchor terminal is located in the position information interaction region is detected, determine a target live broadcast sub-region corresponding to the third interaction sub-region;
and the interactive area updating module is used for updating the position information interactive area into a target information interactive area according to the target live broadcast subarea.
The specific details of each module in the live broadcast interaction device have been described in detail in the corresponding live broadcast interaction method, and therefore are not described herein again.
It should be noted that although in the above detailed description several modules or units of the device for action execution are mentioned, such a division is not mandatory. Indeed, the features and functionality of two or more modules or units described above may be embodied in one module or unit, according to embodiments of the present disclosure. Conversely, the features and functions of one module or unit described above may be further divided so as to be embodied by a plurality of modules or units.
It will be understood that the present disclosure is not limited to the precise arrangements that have been described above and shown in the drawings, and that various modifications and changes may be made without departing from the scope thereof. The scope of the present disclosure is limited only by the appended claims.

Claims (12)

1. A live interaction method, wherein a graphical user interface is provided through a viewer terminal, the method comprising:
acquiring map information of a live broadcast area where an anchor terminal is located, wherein the live broadcast area comprises a plurality of live broadcast sub-areas;
generating a position information interaction area on the graphical user interface according to the map information, wherein the position information interaction area comprises interaction sub-areas which are in one-to-one correspondence with the live broadcast sub-areas;
acquiring video data shot by a camera device arranged in each live broadcast subarea in real time;
and responding to an operation event acting on a first interaction subarea in the position information interaction area, providing a display window on the graphical user interface, and displaying video data shot by a camera device in a live broadcasting subarea corresponding to the first interaction subarea in the display window.
2. The live interaction method as claimed in claim 1, wherein the map information of the live zone comprises a two-dimensional plan view of the live zone, the two-dimensional plan view comprising the information of the live sub-zones;
generating a location information interaction area on the graphical user interface according to the map information, comprising:
and mapping the information of each live broadcast subarea in the two-dimensional plane graph on the graphical user interface to generate the position information interaction area.
3. The live broadcast interaction method as claimed in claim 1, wherein the map information of the live broadcast area includes image information collected by a camera device arranged in each live broadcast sub-area; generating a location information interaction area on the graphical user interface according to the map information, further comprising:
generating the interaction subarea according to the image information;
and splicing the interactive subregions to generate the position information interactive region.
4. The live broadcast interaction method according to claim 1, wherein the displaying, in the display window, video data shot by a camera device in a live broadcast subarea corresponding to the first interaction subarea comprises:
determining a target display area of the display window on the graphical user interface according to a relative positional relationship between a position identifier of the anchor terminal and the first interaction sub-area;
and displaying the video data shot by the camera device in the live broadcast subarea corresponding to the first interactive subarea in the target display area through the display window.
5. The live interaction method of claim 1, further comprising:
and responding to an operation event acting on a first interaction subarea in the position information interaction area, and displaying the first interaction subarea in the position information interaction area in an identification mode.
6. The live interaction method of claim 1, further comprising:
and when an operation event acting on a second interaction sub-region in the position information interaction region is detected, replacing the video data corresponding to the first interaction sub-region in the display window with the video data corresponding to the second interaction sub-region.
7. The live interaction method of claim 1, further comprising:
and acquiring the current position information of the anchor terminal, and displaying the position identifier of the anchor terminal in the position information interaction area.
8. The live broadcast interaction method according to claim 7, wherein the map information of the live broadcast area comprises three-dimensional point cloud data of the live broadcast area, and acquiring the current position information of the anchor terminal comprises:
acquiring an image currently captured by the anchor terminal; and
matching the currently captured image with the three-dimensional point cloud data of the live broadcast area to determine the current position information of the anchor terminal within the live broadcast area.
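The image-to-point-cloud matching in this claim is, in practice, a visual localisation problem: features extracted from the captured frame are matched to descriptors stored with the 3D points, and the 2D-3D correspondences are fed to a pose solver (e.g. OpenCV's solvePnPRansac). The toy sketch below only shows the matching idea, using nearest-descriptor lookup and a centroid as a rough position estimate; the descriptor format and function name are assumptions, not the patent's method.

```python
import numpy as np

def locate_anchor(frame_descriptors, cloud_descriptors, cloud_points):
    """Toy localisation: match each descriptor from the anchor's
    current frame to its nearest point-cloud descriptor, and take the
    centroid of the matched 3D points as a rough position estimate.
    A real system would solve a PnP problem from the 2D-3D
    correspondences instead of averaging points.
    """
    matched = []
    for d in frame_descriptors:
        dists = np.linalg.norm(cloud_descriptors - d, axis=1)
        matched.append(cloud_points[int(np.argmin(dists))])
    return np.mean(matched, axis=0)
```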
9. The live broadcast interaction method according to claim 1, further comprising:
when an operation event acting on a third interaction sub-area, in which the anchor terminal is located, in the position information interaction area is detected, determining a target live broadcast sub-area corresponding to the third interaction sub-area; and
updating the position information interaction area to a target information interaction area according to the target live broadcast sub-area.
10. A live broadcast interaction device, comprising:
a map information acquisition module, configured to acquire map information of a live broadcast area in which the anchor terminal is located, the live broadcast area comprising a plurality of live broadcast sub-areas;
an interaction area generation module, configured to generate a position information interaction area on a graphical user interface according to the map information, wherein the position information interaction area comprises interaction sub-areas in one-to-one correspondence with the live broadcast sub-areas; and
a video data display module, configured to acquire, in real time, video data shot by the camera device arranged in each live broadcast sub-area, and, in response to an operation event acting on a first interaction sub-area in the position information interaction area, provide a display window on the graphical user interface and display, in the display window, the video data shot by the camera device in the live broadcast sub-area corresponding to the first interaction sub-area.
11. A computer-readable storage medium on which a computer program is stored, wherein the computer program, when executed by a processor, implements the method of any one of claims 1 to 9.
12. An electronic device, comprising:
a processor; and
a memory for storing executable instructions of the processor;
wherein the processor is configured to perform the method of any one of claims 1 to 9 via execution of the executable instructions.
CN202110664339.1A 2021-06-16 2021-06-16 Live broadcast interaction method and device, storage medium and electronic equipment Active CN113504867B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110664339.1A CN113504867B (en) 2021-06-16 2021-06-16 Live broadcast interaction method and device, storage medium and electronic equipment

Publications (2)

Publication Number Publication Date
CN113504867A CN113504867A (en) 2021-10-15
CN113504867B CN113504867B (en) 2022-09-30

Family

ID=78009908

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110664339.1A Active CN113504867B (en) 2021-06-16 2021-06-16 Live broadcast interaction method and device, storage medium and electronic equipment

Country Status (1)

Country Link
CN (1) CN113504867B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114051151B (en) * 2021-11-23 2023-11-28 广州博冠信息科技有限公司 Live interaction method and device, storage medium and electronic equipment

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107105315A (en) * 2017-05-11 2017-08-29 广州华多网络科技有限公司 Live broadcasting method, the live broadcasting method of main broadcaster's client, main broadcaster's client and equipment
CN111263174A (en) * 2020-01-14 2020-06-09 广州虎牙科技有限公司 Live broadcast control method and device, electronic equipment, live broadcast system and storage medium
CN112367531A (en) * 2020-10-30 2021-02-12 腾讯科技(深圳)有限公司 Video stream display method, processing method and related equipment
CN112788358A (en) * 2020-12-31 2021-05-11 腾讯科技(深圳)有限公司 Video live broadcast method, video sending method, device and equipment for game match
CN112929687A (en) * 2021-02-05 2021-06-08 腾竞体育文化发展(上海)有限公司 Interaction method, device and equipment based on live video and storage medium

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108174272B (en) * 2017-12-29 2021-01-22 广州虎牙信息科技有限公司 Method and device for displaying interactive information in live broadcast, storage medium and electronic equipment
US11412313B2 (en) * 2018-05-02 2022-08-09 Twitter, Inc. Sharing timestamps for video content in a messaging platform
CN109218754A (en) * 2018-09-28 2019-01-15 武汉斗鱼网络科技有限公司 Information display method, device, equipment and medium in a kind of live streaming
CN111405343A (en) * 2020-03-18 2020-07-10 广州华多网络科技有限公司 Live broadcast interaction method and device, electronic equipment and storage medium

Also Published As

Publication number Publication date
CN113504867A (en) 2021-10-15

Similar Documents

Publication Publication Date Title
CN112104594B (en) Immersive interactive remote participation in-situ entertainment
US10600169B2 (en) Image processing system and image processing method
JP2024050721A (en) Information processing device, information processing method, and computer program
CN112312111A (en) Virtual image display method and device, electronic equipment and storage medium
CN112543344B (en) Live broadcast control method and device, computer readable medium and electronic equipment
CN113079364A (en) Three-dimensional display method, device, medium and electronic equipment for static object
CN111277890A (en) Method for acquiring virtual gift and method for generating three-dimensional panoramic live broadcast room
Suenaga et al. A practical implementation of free viewpoint video system for soccer games
CN114225400A (en) Bullet screen processing method and device, storage medium and electronic equipment
CN113504867B (en) Live broadcast interaction method and device, storage medium and electronic equipment
CN113377472A (en) Account login method, three-dimensional display device and server
CN114863014A (en) Fusion display method and device for three-dimensional model
CN110730340A (en) Lens transformation-based virtual auditorium display method, system and storage medium
JP6559375B1 (en) Content distribution system, content distribution method, and content distribution program
CN113412479A (en) Mixed reality display device and mixed reality display method
JP2002271694A (en) Image processing method, image processing unit, studio device, storage medium and program
CN113194329B (en) Live interaction method, device, terminal and storage medium
CN115174954A (en) Video live broadcast method and device, electronic equipment and storage medium
CN112929685B (en) Interaction method and device for VR live broadcast room, electronic device and storage medium
US20210125339A1 (en) Method and device for segmenting image, and storage medium
CN115174953A (en) Virtual event live broadcast method and system and event live broadcast server
CN110784728B (en) Image data processing method and device and computer readable storage medium
CN117197319B (en) Image generation method, device, electronic equipment and storage medium
CN112286355B (en) Interactive method and system for immersive content
CN114051151B (en) Live interaction method and device, storage medium and electronic equipment

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant