CN113453035A - Live broadcasting method based on augmented reality, related device and storage medium - Google Patents

Live broadcasting method based on augmented reality, related device and storage medium

Info

Publication number
CN113453035A
CN113453035A (application number CN202110763856.4A)
Authority
CN
China
Prior art keywords
augmented reality
data
live broadcast
terminal
live
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202110763856.4A
Other languages
Chinese (zh)
Inventor
孙红亮
王子彬
朱赟
李炳泽
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Zhejiang Shangtang Technology Development Co Ltd
Zhejiang Sensetime Technology Development Co Ltd
Original Assignee
Zhejiang Shangtang Technology Development Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Zhejiang Shangtang Technology Development Co Ltd filed Critical Zhejiang Shangtang Technology Development Co Ltd
Priority to CN202110763856.4A
Publication of CN113453035A
Legal status: Pending (current)

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N 21/20 Servers specifically adapted for the distribution of content, e.g. VOD servers; Operations thereof
    • H04N 21/21 Server components or server architectures
    • H04N 21/218 Source of audio or video content, e.g. local disk arrays
    • H04N 21/2187 Live feed
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N 21/40 Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N 21/43 Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N 21/4302 Content synchronisation processes, e.g. decoder synchronisation
    • H04N 21/4307 Synchronising the rendering of multiple content streams or additional data on devices, e.g. synchronisation of audio on a mobile phone with the video output on the TV screen
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N 21/40 Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N 21/43 Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N 21/436 Interfacing a local distribution network, e.g. communicating with another STB or one or more peripheral devices inside the home
    • H04N 21/4363 Adapting the video or multiplex stream to a specific local network, e.g. a IEEE 1394 or Bluetooth® network
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N 21/80 Generation or processing of content or additional data by content creator independently of the distribution process; Content per se
    • H04N 21/81 Monomedia components thereof
    • H04N 21/816 Monomedia components thereof involving special video data, e.g. 3D video

Abstract

The application discloses a live broadcast method based on augmented reality, a related device and a storage medium. The method includes: acquiring augmented reality data of a live broadcast site; and synchronously pushing the augmented reality data to other terminal devices for playing. With this scheme, after the augmented reality data of the live broadcast site is acquired, it is pushed to other terminal devices for live broadcast, so that users who are not on site can browse it, remotely experience the environment and AR effects of the live broadcast site, and the convenience of content propagation is improved.

Description

Live broadcasting method based on augmented reality, related device and storage medium
Technical Field
The present application relates to the field of augmented reality technologies, and in particular, to a live broadcast method based on augmented reality, a related apparatus, and a storage medium.
Background
Augmented Reality (AR) technology superimposes entity information (visual information, sound, touch, and the like), after simulation, onto the real world, so that the real environment and virtual objects are presented in the same picture or space in real time. In recent years, AR devices have been applied in an increasingly wide range of fields and play an important role in daily life, work and entertainment, so optimizing the effect of the augmented reality scenes presented by AR devices has become more and more important.
In the prior art, a user usually has to go to the site where an AR device is located before experiencing the augmented reality scene presented by the AR device. This experience mode is limited and does not allow users to remotely experience the on-site environment and AR effects.
Disclosure of Invention
The technical problem mainly solved by the present application is to provide a live broadcasting method based on augmented reality, a related device and a storage medium.
In order to solve the above technical problem, a first technical solution adopted by the present application is to provide a live broadcast method based on augmented reality, including: acquiring augmented reality data of a live broadcast site; and synchronously pushing the augmented reality data to other terminal equipment for playing.
Therefore, after the augmented reality data of the live broadcast site is obtained, it is pushed to other terminal devices for playing, so that users can remotely experience the environment and AR effects of the live broadcast site.
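By way of illustration only, and not as part of the claimed method, the following minimal Python sketch shows one possible way to organise these two steps; all names (ARFrame, acquire_ar_data, push_synchronously, the capture and send calls) are hypothetical assumptions rather than anything defined by this application.

```python
# Hypothetical sketch of the two-step method: acquire AR data, then push it.
from dataclasses import dataclass
from typing import Any, Iterable


@dataclass
class ARFrame:
    """One frame of augmented reality data: a real scene image plus overlays."""
    image: Any          # e.g. an array holding the captured real scene image
    effects: list[str]  # identifiers of the target display special effects applied


def acquire_ar_data(capture_device: Any) -> ARFrame:
    """Step 1: capture a real scene image and attach a display special effect."""
    image = capture_device.read()             # assumed capture API
    return ARFrame(image=image, effects=["season_effect"])


def push_synchronously(frame: ARFrame, terminals: Iterable[Any]) -> None:
    """Step 2: push the same AR data to every registered terminal device."""
    for terminal in terminals:
        terminal.send(frame)                   # assumed network/send API
```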
The step of synchronously pushing the augmented reality data to other terminal equipment for playing comprises the following steps: and synchronously pushing the augmented reality data to other terminal equipment for playing by using a stream pushing mode.
Therefore, the augmented reality data can be synchronously pushed to other terminal devices for playing through stream pushing, so that users who are not on site can browse it conveniently, and the convenience of content propagation is improved.
The step of synchronously pushing the augmented reality data to other terminal equipment for playing comprises the following steps: the augmented reality data are pushed to other terminal equipment to be played in a wireless connection mode; the wireless connection mode comprises any one of 2G, 3G, 4G, 5G networks and WIFI.
Therefore, when the wireless connection mode includes any one of 2G, 3G, 4G, 5G network and WIFI, the augmented reality data can be pushed rapidly, so that the remote user can experience the AR scene in real time.
The method for acquiring the augmented reality data of the live broadcast site comprises the following steps: acquiring a real scene image of a live broadcast site through an augmented reality AR vehicle; and obtaining augmented reality data corresponding to the real scene image based on the real scene image.
Therefore, with the augmented reality AR vehicle, real scene images of the live broadcast site can be collected in real time, so that augmented reality data containing augmented information is generated based on the real scene images and the real scene is augmented.
The step of acquiring the real scene image of the live broadcast site through the augmented reality AR vehicle includes: shooting a real scene image of the current frame in real time by the augmented reality AR vehicle while the augmented reality AR vehicle is running. The step of obtaining the augmented reality data corresponding to the real scene image based on the real scene image includes: performing augmented reality processing on the real scene image of the current frame to obtain the augmented reality data of the live broadcast site.
Therefore, by shooting the real scene image of the current frame in real time with the AR vehicle, the view of the scene where the AR vehicle is currently located can be obtained in real time; performing augmented reality processing on the real scene image obtained in real time augments the real scene and helps remote users learn about the on-site environment in real time.
The other terminal equipment comprises any one or more of an AR vehicle, a computer, a mobile phone terminal and an intelligent display screen.
Therefore, by pushing the augmented reality data to any one or more of an augmented reality AR vehicle, a computer, a mobile phone terminal and an intelligent display screen, the channels for content propagation can be expanded, making it convenient for more remote users to experience the on-site scenery and AR effects.
The augmented reality data is obtained by adding a target display special effect to a real scene image of the live broadcast site.
Therefore, adding the target display special effect to the real scene image of the live broadcast site augments the real scene and improves the realism and diversity of the scene presentation, thereby helping users better recognize and understand the real scene.
Before the step of synchronously pushing the augmented reality data to other terminal devices for playing, the method further comprises the following steps: receiving live broadcast requests sent by other terminal equipment; the step of synchronously pushing the augmented reality data to other terminal equipment for playing comprises the following steps: and synchronously pushing the augmented reality data to other terminal equipment based on the live broadcast request.
Therefore, by receiving live broadcast requests sent by other terminal devices, the terminals to which data needs to be pushed can be determined in time, and the augmented reality data can be pushed to those terminals in real time, so that remote users can experience the on-site environment and AR effects in real time.
In order to solve the above technical problem, a second technical solution adopted by the present application is to provide a live broadcast method based on augmented reality, including: receiving augmented reality data pushed by a data acquisition terminal; the augmented reality data are obtained after the data acquisition terminal acquires a display scene of a live broadcast site; and playing the augmented reality data.
Therefore, by receiving the augmented reality data pushed by the data acquisition terminal, the on-site environment and AR effects at the data acquisition terminal can be observed through the live broadcast even when the viewer is far away from the target scene, providing an immersive, on-the-spot experience.
The augmented reality data is obtained by the data acquisition terminal adding a target display special effect to a real scene image of the live broadcast site.
Therefore, adding the target display special effect to the real scene image of the live broadcast site augments the real scene and improves the realism and diversity of the scene presentation, thereby helping users better recognize and understand the real scene.
Before the step of receiving the augmented reality data pushed by the data acquisition terminal, the method further includes: sending a live broadcast request to the data acquisition terminal. The step of receiving the augmented reality data pushed by the data acquisition terminal includes: receiving the augmented reality data sent by the data acquisition terminal based on the live broadcast request. The playing of the augmented reality data includes: playing the augmented reality data through the data display terminal.
Therefore, by actively sending a live broadcast request to the data acquisition terminal, the data acquisition terminal can quickly push the augmented reality data, which improves interactivity for the user; by receiving the augmented reality data sent by the data acquisition terminal based on the live broadcast request and playing it through the data display terminal, the user can follow the on-site environment of the data acquisition terminal in real time from a first-person perspective and be immersed in that environment.
In order to solve the above technical problem, a third technical solution adopted in the present application is to provide a data acquisition terminal, including: the acquisition module is used for acquiring augmented reality data of a live broadcast site; and the pushing module is used for synchronously pushing the augmented reality data to other terminal equipment for playing.
In order to solve the above technical problem, a fourth technical solution adopted by the present application is to provide a data display terminal, including: the receiving module is used for receiving augmented reality data pushed by the data acquisition terminal; the augmented reality data are obtained after the data acquisition terminal acquires a display scene of a live broadcast site; and the display module is used for playing the augmented reality data.
In order to solve the above technical problem, a fifth technical solution adopted by the present application is to provide an electronic device, including: a memory for storing program data, wherein when the stored program data is executed, the steps in the augmented reality-based live broadcast method corresponding to the data acquisition terminal or the data display terminal are realized; and the processor is used for executing the program instructions stored in the memory so as to realize the steps in the augmented reality-based live broadcast method corresponding to the data acquisition terminal or the data display terminal.
In order to solve the above technical problem, a sixth technical solution adopted by the present application is to provide a computer-readable storage medium, where a computer program is stored on the computer-readable storage medium, and when the computer program is executed by a processor, the steps in the augmented reality-based live broadcasting method are implemented as in any one of the above.
With the above scheme, after the augmented reality data of the live broadcast site is acquired, it is pushed to other terminal devices for live broadcast, so that users who are not on site can browse it, remotely experience the environment and AR effects of the live broadcast site, and the convenience of content propagation is improved.
Drawings
In order to more clearly illustrate the technical solutions in the embodiments of the present application, the drawings needed to be used in the description of the embodiments are briefly introduced below, and it is obvious that the drawings in the following description are only some embodiments of the present application, and it is obvious for those skilled in the art to obtain other drawings based on these drawings without creative efforts.
FIG. 1 is a schematic structural diagram of an embodiment of an augmented reality-based live broadcast system of the present application;
fig. 2 is a schematic flowchart of a first embodiment of an augmented reality-based live broadcasting method according to the present application;
FIG. 3 is a flowchart illustrating an embodiment of the augmented reality-based live broadcasting method of FIG. 2;
fig. 4 is a flowchart illustrating a second embodiment of the augmented reality-based live broadcasting method according to the present application;
fig. 5 is a schematic flowchart of a third embodiment of the augmented reality-based live broadcasting method according to the present application;
fig. 6 is a schematic flowchart of a fourth embodiment of the augmented reality-based live broadcasting method according to the present application;
FIG. 7 is a schematic structural diagram of an embodiment of a data acquisition terminal according to the present application;
FIG. 8 is a schematic structural diagram of an embodiment of a data display terminal according to the present application;
FIG. 9 is a schematic diagram of an embodiment of an electronic device;
FIG. 10 is a schematic structural diagram of an embodiment of a computer-readable storage medium according to the present application.
Detailed Description
The technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application, and it is obvious that the described embodiments are only a part of the embodiments of the present application, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present application.
The terminology used in the embodiments of the present application is for the purpose of describing particular embodiments only and is not intended to be limiting of the application. As used in the examples of this application and the appended claims, the singular forms "a", "an", and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise, the "plural" includes at least two in general, but does not exclude the presence of at least one.
It should be understood that the term "and/or" as used herein is merely one type of association that describes an associated object, meaning that three relationships may exist, e.g., a and/or B may mean: a exists alone, A and B exist simultaneously, and B exists alone. In addition, the character "/" herein generally indicates that the former and latter related objects are in an "or" relationship.
It should be understood that the terms "comprises," "comprising," or any other variation thereof, as used herein, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising … …" does not exclude the presence of other identical elements in a process, method, article, or apparatus that comprises the element.
In the prior art, a user usually has to go to the site where an AR device is located before experiencing the augmented reality scene presented by the AR device, so the experience mode is rather limited and is unfavorable for users to remotely experience the on-site environment and AR effects. For example, an AR vehicle integrates AR technology into a vehicle, and as the AR vehicle travels, a user can experience the on-site environment and AR effects; however, at present, experiencing the functions and AR effects of the AR vehicle requires the user to be inside the AR vehicle on site, which is unfavorable for users who are not on site to browse.
In view of this situation, the present application provides a live broadcast method based on augmented reality, a related device and a storage medium, which can solve the problem of the single experience mode described above.
The application firstly provides a live broadcast system based on augmented reality.
Specifically, please refer to fig. 1, fig. 1 is a schematic structural diagram of an embodiment of a live broadcast system based on augmented reality according to the present application. As shown in fig. 1, in the present embodiment, the augmented reality-based live broadcast system 10 includes a data acquisition terminal 101 and a data display terminal 102 that are connected to each other.
The data acquisition terminal 101 is configured to acquire augmented reality data of a live broadcast site, and the data display terminal 102 is configured to receive and display the augmented reality data sent by the data acquisition terminal 101.
Specifically, please refer to fig. 2, where fig. 2 is a schematic flowchart of a first embodiment of the augmented reality-based live broadcasting method according to the present application. As shown in fig. 2, in this embodiment, the method is executed by a data acquisition terminal and includes:
s21: and acquiring augmented reality data of the live broadcast site.
In this embodiment, the data acquisition terminal acquires a real scene image of the live broadcast site, and obtains the augmented reality data corresponding to the real scene image based on the real scene image.
The data acquisition terminal is an AR device, which may be an augmented reality AR vehicle, AR smart glasses, an AR mobile phone, or any other electronic device with an augmented reality function.
In this embodiment, the augmented reality data is obtained by adding a target display special effect to a real scene image of a live broadcast site.
The target display special effect can be a display special effect matched with a target building, a display special effect matched with animals and plants, a season special effect, a virtual character special effect, a special object special effect and the like.
It can be understood that the real scene image of the live broadcast site can be collected in real time by the data acquisition terminal, so that augmented reality data containing augmented information is generated based on the real scene image and the real scene is augmented.
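As a purely illustrative sketch of how such a target display special effect could be composited onto a captured real scene image (assuming OpenCV is used, which the application does not specify), the overlay can simply be blended onto the frame; the file names and blending weight below are placeholders.

```python
import cv2
import numpy as np

def add_target_display_effect(scene_bgr: np.ndarray, effect_bgr: np.ndarray,
                              alpha: float = 0.4) -> np.ndarray:
    """Composite a display special effect onto a real scene image (illustrative only)."""
    effect = cv2.resize(effect_bgr, (scene_bgr.shape[1], scene_bgr.shape[0]))
    # Simple alpha blend: the augmented reality data keeps the real scene visible
    # while superimposing the virtual effect on top of it.
    return cv2.addWeighted(scene_bgr, 1.0 - alpha, effect, alpha, 0.0)

# Example usage with hypothetical image files:
# scene = cv2.imread("live_site_frame.jpg")
# effect = cv2.imread("season_effect.png")
# ar_frame = add_target_display_effect(scene, effect)
```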
S22: and synchronously pushing the augmented reality data to other terminal equipment for playing.
In this embodiment, the data acquisition terminal in the live broadcast site pushes the augmented reality data to other terminal devices not in the live broadcast site in a stream pushing manner.
The other terminal equipment comprises any one or more of an AR vehicle, a computer, a mobile phone terminal and an intelligent display screen.
In a specific implementation scenario, a data acquisition terminal in a live broadcast site pushes augmented reality data to other terminal devices for live broadcast in a stream pushing mode through live broadcast software.
Stream pushing refers to the process of transmitting the content packaged in the acquisition stage to a server, that is, the process of transmitting the on-site video signal to the network.
The live broadcast software may be, for example, YouTube Live, WeChat live streaming, and the like.
Understandably, pushing the stream through mainstream online live broadcast software can greatly expand the channels for content propagation, allowing more users to watch the augmented reality data generated in real time, which satisfies the need for remote experience and fully meets the demands of product display.
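As one common, non-limiting way to realise stream pushing (the application itself does not prescribe a protocol or tool), raw AR frames can be piped into an ffmpeg process that encodes them and pushes an RTMP stream to a live platform; the server URL, resolution and frame rate below are placeholder assumptions.

```python
import subprocess
import numpy as np

WIDTH, HEIGHT, FPS = 1280, 720, 25
RTMP_URL = "rtmp://example-live-server/live/stream_key"   # placeholder URL

# Launch ffmpeg to read raw BGR frames from stdin, encode them with H.264,
# and push the result to the RTMP ingest point of the live platform.
ffmpeg = subprocess.Popen(
    [
        "ffmpeg", "-y",
        "-f", "rawvideo", "-pix_fmt", "bgr24",
        "-s", f"{WIDTH}x{HEIGHT}", "-r", str(FPS), "-i", "-",
        "-c:v", "libx264", "-preset", "veryfast",
        "-f", "flv", RTMP_URL,
    ],
    stdin=subprocess.PIPE,
)

def push_frame(ar_frame: np.ndarray) -> None:
    """Write one augmented reality frame (HEIGHT x WIDTH x 3, uint8) to the encoder."""
    ffmpeg.stdin.write(ar_frame.tobytes())
```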
Furthermore, the data acquisition terminal pushes the augmented reality data to other terminal equipment for playing in a wireless connection mode. The wireless connection mode comprises any one of 2G, 3G, 4G, 5G networks and WIFI.
Understandably, 2G, 3G, 4G and 5G networks offer high speed, low latency and large bandwidth, can meet the requirements of a high-definition AR live broadcast system, and can push the augmented reality data quickly, allowing remote users to experience the AR scene in real time.
With the above scheme, after the data acquisition terminal acquires the augmented reality data of the live broadcast site, the augmented reality data is pushed to other terminal devices for playing, so that users who are not on site can browse it, remotely experience the environment and AR effects of the live broadcast site, and the convenience of content propagation is improved.
Referring to fig. 3, fig. 3 is a flowchart illustrating an embodiment of the augmented reality-based live broadcasting method in fig. 2. As shown in fig. 3, in the present embodiment, taking an augmented reality AR vehicle as an example, the method includes:
s31: and acquiring a real scene image of a live broadcast site through an augmented reality AR vehicle.
In this embodiment, the real scene image of the current frame is captured in real time by the augmented reality AR car when the augmented reality AR car is running.
The augmented reality AR vehicle refers to a vehicle with an augmented reality function.
In this embodiment, the real scene image may be an image of the scene acquired by the augmented reality AR vehicle in real time, or an image of the scene acquired by the augmented reality AR vehicle after an operator triggers a shooting operation.
In a specific implementation scenario, the live broadcast site is an outdoor scenic spot, and the augmented reality AR vehicle collects real scene images of the scenic spot in real time during operation, so that remote users who have not traveled to the scenic spot can learn about it through the collected real scene images.
Understandably, by acquiring real scene images of the live broadcast site in real time, the augmented reality AR vehicle captures the scenery of the site where it is located in real time, which makes it convenient for remote users to learn about the on-site environment.
S32: and obtaining augmented reality data corresponding to the real scene image based on the real scene image.
In this embodiment, augmented reality processing is performed on a real scene image of a current frame to obtain augmented reality data of a live broadcast site.
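The per-frame behaviour described above can be sketched as follows; this is only an assumed structure, with augment and push as hypothetical placeholders for the augmented reality processing and the synchronous pushing, and camera index 0 standing in for the on-board camera of the AR vehicle.

```python
import cv2

def augment(frame):
    """Placeholder for augmented reality processing of the current frame
    (e.g. adding a target display special effect); returns the AR frame."""
    return frame

def push(ar_frame):
    """Placeholder for synchronously pushing the AR frame to other terminals."""
    pass

# Hypothetical per-frame loop on the AR vehicle: capture the current frame in
# real time while the vehicle runs, augment it, and push it out immediately.
capture = cv2.VideoCapture(0)          # 0 = assumed on-board camera index
while capture.isOpened():
    ok, frame = capture.read()         # real scene image of the current frame
    if not ok:
        break
    push(augment(frame))
capture.release()
```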
S33: and synchronously pushing the augmented reality data to other terminal equipment for playing.
In a specific implementation scenario, one augmented reality AR vehicle operates in an outdoor scenic spot while another augmented reality AR vehicle is on static display in an exhibition hall. The operating augmented reality AR vehicle can synchronously push the augmented reality data acquired in real time to the augmented reality AR vehicle in the exhibition hall through the same shared network or the same live broadcast software, and the augmented reality AR vehicle in the exhibition hall plays the augmented reality data, so that the audience in the exhibition hall can remotely experience the environment and AR effects of the outdoor scenic spot and feel as if they were on the scene.
With the above scheme, after the augmented reality data of the live broadcast site is acquired by the augmented reality AR vehicle, it is pushed to other terminal devices for live broadcast, so that users who are not on site can browse it, remotely experience the environment and AR effects of the live broadcast site, and the convenience of content propagation is improved.
Referring to fig. 4, fig. 4 is a flowchart illustrating a second embodiment of the augmented reality-based live broadcasting method according to the present application. As shown in fig. 4, in the present embodiment, the method includes:
s41: and acquiring augmented reality data of the live broadcast site.
S42: and receiving live broadcast requests sent by other terminal equipment.
In this embodiment, the other terminal devices and the data acquisition terminal operating on site use the same live broadcast software, and the data acquisition terminal receives live broadcast requests sent by the other terminal devices through the live broadcast software; alternatively, the other terminal devices and the data acquisition terminal use the same shared network, and the data acquisition terminal receives the live broadcast requests sent by the other terminal devices through that shared network.
The data acquisition terminal comprises any one or more of an AR vehicle, a computer, a mobile phone terminal and an intelligent display screen, and the other terminal equipment comprises any one or more of the AR vehicle, the computer, the mobile phone terminal and the intelligent display screen.
S43: and synchronously pushing the augmented reality data to other terminal equipment based on the live broadcast request.
In this embodiment, the data acquisition terminal pushes the acquired augmented reality data to other terminal devices based on the received live broadcast request.
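A minimal sketch of this request-driven pushing, assuming (purely for illustration) that live broadcast requests arrive as HTTP POSTs carrying the requesting terminal's endpoint, could look like this; the port, path and JSON field name are hypothetical.

```python
import json
import threading
from http.server import BaseHTTPRequestHandler, HTTPServer

# Hypothetical registry of terminals that have sent a live broadcast request.
registered_terminals = set()
lock = threading.Lock()

class LiveRequestHandler(BaseHTTPRequestHandler):
    def do_POST(self):
        """Treat any POST as a live broadcast request carrying the requesting
        terminal's stream endpoint (an assumed JSON field)."""
        length = int(self.headers.get("Content-Length", 0))
        body = json.loads(self.rfile.read(length) or b"{}")
        with lock:
            registered_terminals.add(body.get("endpoint", self.client_address[0]))
        self.send_response(200)
        self.end_headers()

def serve_requests(port: int = 8080) -> None:
    """Run the request listener; a push loop elsewhere reads registered_terminals
    and pushes augmented reality data only to terminals that asked for it."""
    HTTPServer(("0.0.0.0", port), LiveRequestHandler).serve_forever()
```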
In a specific implementation scenario, one augmented reality AR vehicle operates in an outdoor scenic spot while another augmented reality AR vehicle is on static display in an exhibition hall. The augmented reality AR vehicle in the exhibition hall sends a live broadcast request to the operating augmented reality AR vehicle through the same shared network; after receiving the live broadcast request, the operating augmented reality AR vehicle sends the augmented reality data acquired in real time to the augmented reality AR vehicle in the exhibition hall through that shared network, and the augmented reality AR vehicle in the exhibition hall plays the augmented reality data, so that the audience in the exhibition hall can remotely experience the environment and AR effects of the outdoor scenic spot and feel as if they were on the scene.
With the above scheme, after the data acquisition terminal acquires the augmented reality data of the live broadcast site, it can determine in time which terminals need the data by receiving live broadcast requests sent by other terminal devices, and push the augmented reality data to those terminals in real time for playing, so that users who are not on site can browse it, remotely experience the environment and AR effects of the live broadcast site, and the convenience of content propagation is improved.
Referring to fig. 5, fig. 5 is a schematic flowchart illustrating a third embodiment of a live broadcast method based on augmented reality according to the present application. As shown in fig. 5, in the present embodiment, the method includes:
s51: receiving augmented reality data pushed by a data acquisition terminal; the augmented reality data are obtained after the data acquisition terminal acquires a display scene of a live broadcast site.
In this embodiment, the augmented reality data is obtained by adding the target display special effect to a real scene image of a live broadcast site by the data acquisition terminal.
The data acquisition terminal can be an augmented reality AR vehicle running on a live broadcast site, and can also be an AR mobile phone on the live broadcast site or any electronic equipment with an augmented reality function.
Understandably, adding the target display special effect to the real scene image of the live broadcast site augments the real scene and improves the realism and diversity of the scene presentation, thereby helping users better recognize and understand the real scene.
S52: and playing the augmented reality data.
In this embodiment, augmented reality data is played by the data display terminal.
Understandably, by receiving the augmented reality data pushed by the data acquisition terminal, the data display terminal can observe the on-site environment and AR effects at the data acquisition terminal through the live broadcast even when it is far away from the target scene, providing an immersive, on-the-spot experience.
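For illustration only, and assuming the pushed augmented reality data arrives as a standard video stream that OpenCV can open (the application does not limit the transport), playback on the data display terminal could be sketched as follows; the stream URL is a placeholder.

```python
import cv2

# Hypothetical playback loop on the data display terminal: open the pushed
# live stream and render each augmented reality frame as it arrives.
STREAM_URL = "rtmp://example-live-server/live/stream_key"   # placeholder

player = cv2.VideoCapture(STREAM_URL)
while player.isOpened():
    ok, ar_frame = player.read()
    if not ok:
        break
    cv2.imshow("AR live broadcast", ar_frame)
    if cv2.waitKey(1) & 0xFF == ord("q"):   # press q to stop playback
        break
player.release()
cv2.destroyAllWindows()
```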
Referring to fig. 6, fig. 6 is a schematic flowchart illustrating a fourth embodiment of the augmented reality-based live broadcasting method according to the present application. As shown in fig. 6, in the present embodiment, the main execution subject of the method is a data display terminal, and the method includes:
s61: and sending a live broadcast request to the data acquisition terminal.
In this embodiment, the data display terminal sends a live broadcast request to the data acquisition terminal, so that the data acquisition terminal sends augmented reality data based on the live broadcast request.
Specifically, the data display terminal and the data acquisition terminal use the same live broadcast software, and the data display terminal sends a live broadcast request to the data acquisition terminal on the live broadcast software; or the data display terminal and the data acquisition terminal use the same shared network, and the data display terminal sends a live broadcast request to the data acquisition terminal through the same shared network.
The data acquisition terminal may be an augmented reality AR vehicle operating on the live broadcast site, an AR mobile phone on the live broadcast site, or any electronic device with an augmented reality function, which is not limited in the present application.
S62: and receiving augmented reality data sent by the data acquisition terminal based on the live broadcast request.
S63: and playing the augmented reality data through the data display terminal.
In a specific implementation scenario, the data acquisition terminal is an augmented reality AR vehicle operating in an outdoor scenic spot, the data display terminal is a computer, and the two use the same shared network. The user sends a live broadcast request to the augmented reality AR vehicle from the computer, receives the augmented reality data sent by the augmented reality AR vehicle based on the live broadcast request, and plays the augmented reality data through the computer, so that the user can observe the on-site environment and AR effects where the augmented reality AR vehicle is located and feel as if they were on the scene.
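A minimal client-side sketch of sending such a live broadcast request over the shared network is given below; the acquisition terminal's address and the JSON field are assumptions, and after the request is acknowledged the terminal would open and play the pushed stream as in the previous embodiment.

```python
import json
import urllib.request

# Hypothetical request sent by the data display terminal over the shared network;
# the address and field name are placeholders, not part of the application.
ACQUISITION_TERMINAL = "http://192.168.1.10:8080/live_request"

req = urllib.request.Request(
    ACQUISITION_TERMINAL,
    data=json.dumps({"endpoint": "viewer-01"}).encode("utf-8"),
    headers={"Content-Type": "application/json"},
)
with urllib.request.urlopen(req) as reply:
    print("live broadcast request acknowledged:", reply.status)
```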
With the above scheme, by actively sending a live broadcast request to the data acquisition terminal, the data display terminal can receive in time the augmented reality data sent by the data acquisition terminal based on the live broadcast request and play it, so that users who are not on site can remotely experience the environment and AR effects of the live broadcast site, and the convenience of content propagation is improved.
Correspondingly, the application provides a related device of the live broadcast method based on the augmented reality.
Referring to fig. 7, fig. 7 is a schematic structural diagram of an embodiment of a data acquisition terminal according to the present application. As shown in fig. 7, in the present embodiment, the data acquisition terminal 70 includes an acquisition module 71 and a push module 72.
The obtaining module 71 is configured to obtain augmented reality data of a live broadcast site.
The pushing module 72 is configured to synchronously push the augmented reality data to other terminal devices for playing.
For the obtaining and pushing processes, please refer to the related descriptions of steps S21-S22, S31-S33 and S41-S43, which are not repeated here.
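Purely as an illustrative sketch of this module structure (not an implementation disclosed by the application), the data acquisition terminal could be organised as follows; all class and method names, and the capture and send calls, are hypothetical.

```python
class ObtainingModule:
    """Obtains augmented reality data of the live broadcast site (module 71)."""
    def obtain(self, capture_device):
        frame = capture_device.read()          # assumed capture API
        return frame                           # AR processing would be applied here

class PushingModule:
    """Synchronously pushes the augmented reality data to other terminals (module 72)."""
    def push(self, ar_data, terminals):
        for terminal in terminals:
            terminal.send(ar_data)             # assumed network API

class DataAcquisitionTerminal:
    """Hypothetical composition of the two modules described above."""
    def __init__(self):
        self.obtaining = ObtainingModule()
        self.pushing = PushingModule()
```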
With the above scheme, after the obtaining module 71 acquires the augmented reality data of the live broadcast site, the pushing module 72 pushes the augmented reality data to other terminal devices for live broadcast, so that users who are not on site can browse it, remotely experience the environment and AR effects of the live broadcast site, and the convenience of content propagation is improved.
Referring to fig. 8, fig. 8 is a schematic structural diagram of an embodiment of a data display terminal according to the present application. As shown in fig. 8, in the present embodiment, the data display terminal 80 includes a receiving module 81 and a display module 82.
The receiving module 81 is configured to receive augmented reality data pushed by a data acquisition terminal; the augmented reality data are obtained after the data acquisition terminal acquires a display scene of a live broadcast site.
The display module 82 is used for playing the augmented reality data.
The receiving and displaying process is described in steps S51-S52 and steps S61-S63, and will not be described herein again.
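Likewise, a hypothetical sketch of the data display terminal's module structure is given below; the recv and show calls are assumed placeholders for the actual receiving and rendering interfaces.

```python
class ReceivingModule:
    """Receives augmented reality data pushed by the data acquisition terminal (module 81)."""
    def receive(self, connection):
        return connection.recv()               # assumed network API

class DisplayModule:
    """Plays the received augmented reality data (module 82)."""
    def play(self, ar_data, screen):
        screen.show(ar_data)                   # assumed rendering API

class DataDisplayTerminal:
    """Hypothetical composition of the two modules described above."""
    def __init__(self):
        self.receiving = ReceivingModule()
        self.display = DisplayModule()
```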
With the above scheme, the receiving module 81 receives the augmented reality data pushed by the data acquisition terminal and the display module 82 plays the augmented reality data, so that users who are not on site can remotely experience the on-site environment and AR effects, and the convenience of content propagation is improved.
Referring to fig. 9, fig. 9 is a schematic structural diagram of an embodiment of an electronic device according to the present application. As shown in fig. 9, in this embodiment, the electronic device 90 includes a memory 91 and a processor 92.
In this embodiment, the memory 91 is used for storing program data, and when the program data is executed, the steps in the augmented reality-based live broadcast method corresponding to the data acquisition terminal or the data display terminal in any of the above method embodiments can be implemented; the processor 92 is configured to execute the program instructions stored in the memory 91 to implement the steps in the augmented reality based live broadcasting method corresponding to the data acquisition terminal or the data display terminal in any of the above-described method embodiments.
In particular, the processor 92 is adapted to control itself and the memory 91 to implement the steps of any of the above-described method embodiments. The processor 92 may also be referred to as a CPU (Central Processing Unit). The processor 92 may be an integrated circuit chip having signal processing capabilities. The processor 92 may also be a general-purpose processor, a Digital Signal Processor (DSP), an Application Specific Integrated Circuit (ASIC), a Field Programmable Gate Array (FPGA) or other programmable logic device, discrete gate or transistor logic, or discrete hardware components. A general-purpose processor may be a microprocessor, or the processor may be any conventional processor. In addition, the processor 92 may be jointly implemented by a plurality of integrated circuit chips.
With the above scheme, the processor 92 acquires the augmented reality data of the live broadcast site and pushes it to the data display terminal for playing, so that users who are not on site can browse it, remotely experience the on-site environment and AR effects, and the convenience of content propagation is improved; alternatively, the processor 92 receives the augmented reality data pushed by the data acquisition terminal and plays it, so that users who are not on site can remotely experience the on-site environment and AR effects, and the convenience of content propagation is improved.
Accordingly, the present application provides a computer-readable storage medium.
Referring to fig. 10, fig. 10 is a schematic structural diagram of an embodiment of a computer-readable storage medium according to the present application.
The computer-readable storage medium 100 includes a computer program 1001 stored on the computer-readable storage medium 100, and when the computer program 1001 is executed by the processor, the steps in any of the method embodiments described above or the steps correspondingly executed by the relevant apparatus in the method embodiments described above are implemented.
In particular, the integrated unit, if implemented in the form of a software functional unit and sold or used as a separate product, may be stored in a computer readable storage medium 100. Based on such understanding, the technical solution of the present application may be substantially implemented or contributed to by the prior art, or all or part of the technical solution may be embodied in a software product, which is stored in a computer-readable storage medium 100 and includes several instructions for causing a computer device (which may be a personal computer, a server, a network device, or the like) or a processor (processor) to execute all or part of the steps of the method of the embodiments of the present application. And the aforementioned computer-readable storage medium 100 includes: a U-disk, a removable hard disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk or an optical disk, and other various media capable of storing program codes.
In the several embodiments provided in the present application, it should be understood that the disclosed method and apparatus may be implemented in other ways. For example, the above-described apparatus embodiments are merely illustrative, and for example, a division of a module or a unit is merely a logical division, and an actual implementation may have another division, for example, a plurality of units or components may be combined or integrated into another system, or some features may be omitted, or not executed. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection of devices or units through some interfaces, and may be in an electrical, mechanical or other form.
Units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the embodiment.
In addition, functional units in the embodiments of the present application may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit. The integrated unit can be realized in a form of hardware, and can also be realized in a form of a software functional unit.
The integrated unit, if implemented in the form of a software functional unit and sold or used as a stand-alone product, may be stored in a computer readable storage medium. Based on such understanding, the technical solution of the present application may be substantially implemented or contributed to by the prior art, or all or part of the technical solution may be embodied in a software product, which is stored in a storage medium and includes instructions for causing a computer device (which may be a personal computer, a server, a network device, or the like) or a processor (processor) to execute all or part of the steps of the method according to the embodiments of the present application. And the aforementioned storage medium includes: a U-disk, a removable hard disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk or an optical disk, and other various media capable of storing program codes.
The above description is only for the purpose of illustrating embodiments of the present application and is not intended to limit the scope of the present application, and all modifications of equivalent structures and equivalent processes, which are made by the contents of the specification and the drawings of the present application or are directly or indirectly applied to other related technical fields, are also included in the scope of the present application.

Claims (15)

1. A live broadcasting method based on augmented reality is characterized by comprising the following steps:
acquiring augmented reality data of a live broadcast site;
and synchronously pushing the augmented reality data to other terminal equipment for playing.
2. The augmented reality-based live broadcasting method according to claim 1, wherein the step of synchronously pushing the augmented reality data to other terminal devices for playing comprises:
and pushing the augmented reality data to other terminal equipment for playing in a stream pushing mode.
3. The augmented reality-based live broadcasting method according to claim 1, wherein the step of synchronously pushing the augmented reality data to other terminal devices for playing comprises:
synchronously pushing the augmented reality data to other terminal equipment for playing in a wireless connection mode; the wireless connection mode comprises any one of 2G, 3G, 4G, 5G networks and WIFI.
4. The augmented reality-based live broadcasting method according to claim 1, wherein the step of acquiring augmented reality data of a live broadcast site includes:
acquiring a real scene image of the live broadcast scene through an Augmented Reality (AR) vehicle;
and obtaining the augmented reality data corresponding to the real scene image based on the real scene image.
5. The augmented reality-based live broadcast method according to claim 4, wherein the step of capturing the real scene image of the live broadcast site by an Augmented Reality (AR) vehicle comprises:
shooting a real scene image of a current frame in real time by the augmented reality AR vehicle when the augmented reality AR vehicle runs;
the step of obtaining the augmented reality data corresponding to the real scene image based on the real scene image includes:
and performing augmented reality processing on the real scene image of the current frame to obtain augmented reality data of the live broadcast site.
6. The augmented reality-based live broadcast method according to claim 1, wherein the other terminal devices comprise any one or more of an augmented reality (AR) vehicle, a computer, a mobile phone terminal and an intelligent display screen.
7. The augmented reality-based live broadcasting method according to any one of claims 1 to 6, wherein the augmented reality data is obtained by adding a target display special effect to a real scene image of the live broadcasting scene.
8. The augmented reality-based live broadcasting method according to claim 1, wherein before the step of synchronously pushing the augmented reality data to other terminal devices for playing, the method further comprises:
receiving the live broadcast request sent by the other terminal equipment;
the step of synchronously pushing the augmented reality data to other terminal equipment for playing comprises the following steps:
and synchronously pushing the augmented reality data to other terminal equipment based on the live broadcast request.
9. A live broadcasting method based on augmented reality is characterized by comprising the following steps:
receiving augmented reality data pushed by a data acquisition terminal; the augmented reality data are obtained after the data acquisition terminal acquires a display scene of a live broadcast site;
and playing the augmented reality data.
10. The augmented reality live broadcasting method according to claim 9, wherein the augmented reality data is obtained by the data acquisition terminal adding a target display special effect to a real scene image of the live broadcasting site.
11. The augmented reality live broadcasting method according to claim 9, further comprising, before the step of receiving augmented reality data pushed by the data acquisition terminal:
sending a live broadcast request to the data acquisition terminal;
the step of receiving augmented reality data pushed by the data acquisition terminal comprises:
receiving the augmented reality data sent by the data acquisition terminal based on the live broadcast request;
the playing the augmented reality data comprises:
and playing the augmented reality data through a data display terminal.
12. A data acquisition terminal, comprising:
the acquisition module is used for acquiring augmented reality data of a live broadcast site;
and the pushing module is used for synchronously pushing the augmented reality data to other terminal equipment for playing.
13. A data display terminal, comprising:
the receiving module is used for receiving augmented reality data pushed by the data acquisition terminal; the augmented reality data are obtained after the data acquisition terminal acquires a display scene of a live broadcast site;
and the display module is used for playing the augmented reality data.
14. An electronic device, comprising:
a memory for storing program data which when executed implement the steps in an augmented reality based live method as claimed in any one of claims 1 to 8 or any one of claims 9 to 11;
a processor for executing the program instructions stored by the memory to implement the steps in the augmented reality based live method of any one of claims 1 to 8 or any one of claims 9 to 11.
15. A computer-readable storage medium, having stored thereon a computer program which, when being executed by a processor, carries out the steps of the augmented reality based live method according to any one of claims 1 to 8 or any one of claims 9 to 11.
CN202110763856.4A 2021-07-06 2021-07-06 Live broadcasting method based on augmented reality, related device and storage medium Pending CN113453035A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110763856.4A CN113453035A (en) 2021-07-06 2021-07-06 Live broadcasting method based on augmented reality, related device and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110763856.4A CN113453035A (en) 2021-07-06 2021-07-06 Live broadcasting method based on augmented reality, related device and storage medium

Publications (1)

Publication Number Publication Date
CN113453035A true CN113453035A (en) 2021-09-28

Family

ID=77815614

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110763856.4A Pending CN113453035A (en) 2021-07-06 2021-07-06 Live broadcasting method based on augmented reality, related device and storage medium

Country Status (1)

Country Link
CN (1) CN113453035A (en)

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20140178029A1 (en) * 2012-12-26 2014-06-26 Ali Fazal Raheman Novel Augmented Reality Kiosks
US20190025904A1 (en) * 2017-07-19 2019-01-24 Facebook, Inc. Systems and Methods for Incrementally Downloading Augmented-Reality Effects
CN109963163A (en) * 2017-12-26 2019-07-02 阿里巴巴集团控股有限公司 Internet video live broadcasting method, device and electronic equipment
CN109040766A (en) * 2018-08-27 2018-12-18 百度在线网络技术(北京)有限公司 live video processing method, device and storage medium
CN110176077A (en) * 2019-05-23 2019-08-27 北京悉见科技有限公司 The method, apparatus and computer storage medium that augmented reality is taken pictures
CN110784733A (en) * 2019-11-07 2020-02-11 广州虎牙科技有限公司 Live broadcast data processing method and device, electronic equipment and readable storage medium

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114374853A (en) * 2021-11-23 2022-04-19 云南腾云信息产业有限公司 Content display method and device, computer equipment and storage medium
CN115665437A (en) * 2022-12-21 2023-01-31 深圳市易云数字科技有限责任公司 Scene customizable on-site interactive AR slow live broadcast system

Similar Documents

Publication Publication Date Title
JP6727669B2 (en) Information interaction method, device, and system
CN111629225B (en) Visual angle switching method, device and equipment for live broadcast of virtual scene and storage medium
JP6724110B2 (en) Avatar display system in virtual space, avatar display method in virtual space, computer program
CN108737882B (en) Image display method, image display device, storage medium and electronic device
US20170195650A1 (en) Method and system for multi point same screen broadcast of video
US8665374B2 (en) Interactive video insertions, and applications thereof
US20170171274A1 (en) Method and electronic device for synchronously playing multiple-cameras video
CN106303555A (en) A kind of live broadcasting method based on mixed reality, device and system
CN107864122B (en) Display method and device for live stream of main broadcast with wheat
WO2023029823A1 (en) Game picture display method and apparatus, device and storage medium
CN113453035A (en) Live broadcasting method based on augmented reality, related device and storage medium
CN108322474B (en) Virtual reality system based on shared desktop, related device and method
US20170225077A1 (en) Special video generation system for game play situation
CN108833892A (en) A kind of VR live broadcast system
CN108810600A (en) A kind of switching method of video scene, client and server
CN113630614A (en) Game live broadcast method, device, system, electronic equipment and readable storage medium
CN113490006A (en) Live broadcast interaction method and equipment based on bullet screen
CN103442288A (en) Method, device and system for processing of trans-equipment data contents
CN111277768A (en) Electric competition studio system
CN112492324A (en) Data processing method and system
CN112383794B (en) Live broadcast method, live broadcast system, server and computer storage medium
CN111249723B (en) Method, device, electronic equipment and storage medium for display control in game
KR20210084248A (en) Method and apparatus for providing a platform for transmitting vr contents
KR20150004494A (en) Social media platform system based on augmented reality and method of supplying social media contents based on augmented reality
KR101915065B1 (en) Live streaming system for virtual reality contents and operating method thereof

Legal Events

Code Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication (application publication date: 20210928)