CN114466202A - Mixed reality live broadcast method and device, electronic equipment and readable storage medium - Google Patents

Mixed reality live broadcast method and device, electronic equipment and readable storage medium Download PDF

Info

Publication number
CN114466202A
CN114466202A (application CN202011231625.0A)
Authority
CN
China
Prior art keywords
live
video
mixed reality
virtual
live broadcast
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202011231625.0A
Other languages
Chinese (zh)
Other versions
CN114466202B (en)
Inventor
熊壮
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
China Mobile Communications Group Co Ltd
China Mobile IoT Co Ltd
Original Assignee
China Mobile Communications Group Co Ltd
China Mobile IoT Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by China Mobile Communications Group Co Ltd and China Mobile IoT Co Ltd
Priority to CN202011231625.0A
Publication of CN114466202A
Application granted
Publication of CN114466202B
Legal status: Active
Anticipated expiration

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/20Servers specifically adapted for the distribution of content, e.g. VOD servers; Operations thereof
    • H04N21/21Server components or server architectures
    • H04N21/218Source of audio or video content, e.g. local disk arrays
    • H04N21/2187Live feed
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T19/00Manipulating 3D models or images for computer graphics
    • G06T19/006Mixed reality
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/20Servers specifically adapted for the distribution of content, e.g. VOD servers; Operations thereof
    • H04N21/23Processing of content or additional data; Elementary server operations; Server middleware
    • H04N21/234Processing of video elementary streams, e.g. splicing of video streams or manipulating encoded video stream scene graphs
    • H04N21/23418Processing of video elementary streams, e.g. splicing of video streams or manipulating encoded video stream scene graphs involving operations for analysing video streams, e.g. detecting features or characteristics
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/20Servers specifically adapted for the distribution of content, e.g. VOD servers; Operations thereof
    • H04N21/23Processing of content or additional data; Elementary server operations; Server middleware
    • H04N21/234Processing of video elementary streams, e.g. splicing of video streams or manipulating encoded video stream scene graphs
    • H04N21/2343Processing of video elementary streams, e.g. splicing of video streams or manipulating encoded video stream scene graphs involving reformatting operations of video signals for distribution or compliance with end-user requests or end-user device requirements
    • H04N21/234309Processing of video elementary streams, e.g. splicing of video streams or manipulating encoded video stream scene graphs involving reformatting operations of video signals for distribution or compliance with end-user requests or end-user device requirements by transcoding between formats or standards, e.g. from MPEG-2 to MPEG-4 or from Quicktime to Realvideo
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/44Processing of video elementary streams, e.g. splicing a video clip retrieved from local storage with an incoming video stream or rendering scenes according to encoded video stream scene graphs
    • H04N21/44008Processing of video elementary streams, e.g. splicing a video clip retrieved from local storage with an incoming video stream or rendering scenes according to encoded video stream scene graphs involving operations for analysing video streams, e.g. detecting features or characteristics in the video stream
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/44Processing of video elementary streams, e.g. splicing a video clip retrieved from local storage with an incoming video stream or rendering scenes according to encoded video stream scene graphs
    • H04N21/4402Processing of video elementary streams, e.g. splicing a video clip retrieved from local storage with an incoming video stream or rendering scenes according to encoded video stream scene graphs involving reformatting operations of video signals for household redistribution, storage or real-time display
    • H04N21/440218Processing of video elementary streams, e.g. splicing a video clip retrieved from local storage with an incoming video stream or rendering scenes according to encoded video stream scene graphs involving reformatting operations of video signals for household redistribution, storage or real-time display by transcoding between formats or standards, e.g. from MPEG-2 to MPEG-4

Landscapes

  • Engineering & Computer Science (AREA)
  • Signal Processing (AREA)
  • Multimedia (AREA)
  • General Engineering & Computer Science (AREA)
  • Computer Graphics (AREA)
  • Computer Hardware Design (AREA)
  • Databases & Information Systems (AREA)
  • Software Systems (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Two-Way Televisions, Distribution Of Moving Picture Or The Like (AREA)
  • Processing Or Creating Images (AREA)

Abstract

The invention provides a mixed reality live broadcast method and device, an electronic device, and a readable storage medium. The method comprises: collecting a live video of a live scene; rendering the live video to obtain a virtual video; and transmitting the live video and the virtual video to a cloud server through a first slice private network and a second slice private network, respectively. By transmitting the different types of video in the live broadcast over different slice private networks, the invention fundamentally changes the transmission mode and increases the transmission rate, while the mixed reality live broadcast mode makes live interaction more entertaining.

Description

Mixed reality live broadcast method and device, electronic equipment and readable storage medium
Technical Field
The present invention relates to the field of communications, and in particular, to a mixed reality live broadcasting method and apparatus, an electronic device, and a readable storage medium.
Background
Mixed Reality (MR) refers to a new visualization environment created by merging the real and virtual worlds. Physical and digital objects coexist in this environment and interact in real time.
Nowadays, with the development of electronic information technology and the continuous progress of computer technology, people's daily entertainment activities have become increasingly rich. Live broadcasting, with its rich content choices and interactivity, is popular with users.
Current live broadcast methods generally generate Virtual Reality (VR) images through visual technology alone and communicate over 5G, yet the same live broadcast mode would work equally over a 4G network: the transmission mode is essentially unchanged, the transmission rate is low, and live interaction offers little entertainment.
Disclosure of Invention
The invention provides a mixed reality live broadcast method and device, an electronic device, and a readable storage medium, to solve the problems of low transmission rate and poor entertainment of live interaction caused by the fact that the transmission mode of live broadcasting has not fundamentally changed.
According to a first aspect of the present invention, there is provided a mixed reality live broadcast method applied to an electronic device, comprising: collecting a live video of a live scene; rendering the live video to obtain a virtual video; and transmitting the live video and the virtual video to a cloud server through a first slice private network and a second slice private network, respectively.
In some embodiments, the step of rendering the live video to obtain the virtual video includes: analyzing each frame of live image of the live video; extracting a feature value of a motion trajectory of an object in the live image; and obtaining the virtual video according to the feature value.
In some embodiments, the step of extracting the feature value of the motion trajectory of the object in the live image includes: dividing each frame of live image into grids to collect feature points; filtering invalid feature points; recording the valid feature points of each frame; obtaining a time-based motion trajectory of the object according to the valid feature points of each frame; and extracting the feature value based on the motion trajectory.
According to a second aspect of the present invention, there is provided a mixed reality live broadcast method applied to a server, comprising: receiving a live video and a virtual video transmitted by an electronic device at an input end through a first slice private network and a second slice private network, respectively; and transmitting the live video and the virtual video to an electronic device at an output end.
According to a third aspect of the present invention, there is provided a mixed reality live broadcast method applied to an electronic device, comprising: receiving a live video and a virtual video transmitted by a cloud server; and displaying a live video picture or a virtual video picture according to the live video and the virtual video.
In some embodiments, the virtual video picture comprises at least one of: a character model in the live picture, an article model in the live picture, and a model of the user of the electronic device.
In some embodiments, the method further comprises: receiving a switching input for the live video picture or the virtual video picture; and switching between the live video picture and the virtual video picture in response to the switching input.
According to a fourth aspect of the present invention, there is provided a mixed reality live broadcast device applied to an electronic device, comprising: a capture module, configured to collect a live video of a live scene; a rendering module, configured to render the live video to obtain a virtual video; and a first transmission module, configured to transmit the live video and the virtual video to a cloud server through a first slice private network and a second slice private network, respectively.
According to a fifth aspect of the present invention, there is provided a mixed reality live broadcast device applied to a server, comprising: a first receiving module, configured to receive a live video and a virtual video transmitted by an electronic device at an input end through a first slice private network and a second slice private network, respectively; and a second transmission module, configured to transmit the live video and the virtual video to an electronic device at an output end.
According to a sixth aspect of the present invention, the present invention provides a mixed reality live broadcasting device applied to an electronic device, including: the second receiving module is used for receiving live videos and virtual videos transmitted by the cloud server; and the display module is used for displaying live video pictures or virtual video pictures according to the live video and the virtual video.
According to a seventh aspect of the present invention, there is provided an electronic device comprising a processor, a memory, and a program or instructions stored on the memory and executable on the processor, wherein the program or instructions, when executed by the processor, implement the steps of the mixed reality live broadcast method described above.
According to an eighth aspect of the present invention, there is provided a readable storage medium storing a program or instructions which, when executed by a processor, implement the steps of the mixed reality live broadcast method described above.
Compared with the prior art, the invention has the following beneficial effects: the different types of video in the live broadcast are transmitted through different slice private networks, which fundamentally changes the transmission mode and increases the transmission rate, while the mixed reality live broadcast mode makes live interaction more entertaining.
Drawings
Fig. 1 is a schematic flowchart of a mixed reality live broadcast method according to an embodiment of the present invention.
Fig. 2 is a schematic flowchart of another mixed reality live broadcast method according to an embodiment of the present invention.
Fig. 3 is a schematic flowchart of another mixed reality live broadcast method according to an embodiment of the present invention.
Fig. 4 is a schematic structural diagram of a mixed reality live broadcast device according to an embodiment of the present invention.
Fig. 5 is a schematic structural diagram of another mixed reality live broadcast device according to an embodiment of the present invention.
Fig. 6 is a schematic structural diagram of another mixed reality live broadcast device according to an embodiment of the present invention.
Fig. 7 is a schematic structural diagram of an electronic device according to an embodiment of the present invention.
Detailed Description
The technical solution in the embodiments of the present invention will be clearly and completely described below with reference to the accompanying drawings in the embodiments of the present invention. It is to be understood that the described embodiments are merely exemplary of the invention, and not restrictive of the full scope of the invention. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
As shown in fig. 1, the present invention provides a mixed reality live broadcasting method applied to an electronic device, and the method includes steps S11 to S13.
Step S11: collecting a live video of a live scene.
In the embodiment of the present invention, optionally, the live video is obtained by shooting the live scene with a camera, a mobile phone, or a similar device. The live video of the live scene includes the anchor and the background environment in which the anchor is located. The anchor may be a person or an article.
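A minimal sketch of this capture step is shown below, assuming a Python/OpenCV implementation; the patent does not prescribe any particular library, and the camera index 0 is an illustrative assumption.

```python
# Minimal capture sketch, assuming OpenCV; not the patent's prescribed implementation.
import cv2

def capture_live_frames(device_index=0):
    """Yield BGR frames of the live scene from a camera (index 0 is assumed)."""
    cap = cv2.VideoCapture(device_index)
    if not cap.isOpened():
        raise RuntimeError("camera not available")
    try:
        while True:
            ok, frame = cap.read()
            if not ok:          # stream ended or camera error
                break
            yield frame
    finally:
        cap.release()
```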
Step S12: rendering the live video to obtain a virtual video.
In the embodiment of the present invention, optionally, the live video is rendered (for example, reconstructed in an image engine) to form virtual data of the persons and articles, and the virtual video is obtained from the virtual data.
In some embodiments, optionally, step S12 specifically includes steps S121 to S123.
Step S121: analyzing each frame of live image of the live video.
Step S122: extracting a feature value of the motion trajectory of an object in the live image. Step S122 specifically includes steps S1221 to S1225.
Step S1221: dividing each frame of live image into grids to collect feature points.
In the embodiment of the present invention, each frame of live image is sampled at multiple scales, where a scale is an image scale and can also be understood as a resolution; typically 8 spatial scales are used, and sampling over multiple spatial scales ensures that the sampled feature points cover all spatial positions and scales. At each scale, the feature points are densely collected by dividing the image into grids.
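A minimal sketch of this sampling step is given below, assuming a Python/OpenCV implementation; the function name, the grid step of 5 pixels, and the 1/√2 scale factor are illustrative assumptions rather than values taken from the patent.

```python
# Hedged sketch of step S1221: dense grid sampling of feature points over
# several spatial scales (8 scales, as in the text; step and scale factor are assumptions).
import cv2
import numpy as np

def dense_sample(frame, num_scales=8, scale_factor=1.0 / np.sqrt(2), step=5):
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    samples = []                      # list of (scaled image, Nx2 feature points)
    img = gray
    for _ in range(num_scales):
        h, w = img.shape
        ys, xs = np.mgrid[step // 2:h:step, step // 2:w:step]
        pts = np.stack([xs.ravel(), ys.ravel()], axis=1).astype(np.float32)
        samples.append((img.copy(), pts))
        img = cv2.resize(img, None, fx=scale_factor, fy=scale_factor)
    return samples
```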
Step S1222: filtering invalid feature points.
In the embodiment of the present invention, invalid feature points are feature points that lack variation and cannot be tracked; by computing the eigenvalues of the autocorrelation matrix of the pixels, feature points below a certain threshold can be filtered out.
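A hedged sketch of this filtering step follows, using the minimum eigenvalue of the pixel autocorrelation (structure) matrix as the trackability measure; the block size and the relative threshold are assumptions.

```python
# Hedged sketch of step S1222: discard feature points whose autocorrelation-matrix
# eigenvalue is below a threshold (blockSize and the relative threshold are assumed).
import cv2
import numpy as np

def filter_points(gray, pts, block_size=3, quality=0.001):
    eig = cv2.cornerMinEigenVal(gray, blockSize=block_size)  # min eigenvalue per pixel
    threshold = quality * eig.max()
    xs = pts[:, 0].astype(int)
    ys = pts[:, 1].astype(int)
    keep = eig[ys, xs] > threshold       # keep only points with enough local variation
    return pts[keep]
```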
Step S1223: recording the valid feature points of each frame.
In the embodiment of the present invention, a valid feature point in frame t is denoted P_t = (x_t, y_t); in frame t+1, the corresponding valid feature point is P_{t+1} = (x_{t+1}, y_{t+1}).
Step S1224: obtaining a time-based motion trajectory of the object according to the valid feature points of each frame.
In the embodiment of the present invention, the positions of a given feature point over a number of consecutive live images form a trajectory.
Step S1225: extracting the feature value based on the motion trajectory.
In the embodiment of the present invention, subsequent feature points are extracted along the motion trajectory, and after a feature point is extracted, its feature value is calculated.
Step S123: obtaining the virtual video according to the feature values.
In the embodiment of the present invention, a segment of live video contains a large number of motion trajectories, each corresponding to one group of feature values; the virtual video is obtained by performing feature coding on all the groups of feature values.
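The sketch below ties steps S1223 to S1225 and step S123 together. Pyramidal Lucas-Kanade optical flow is assumed for the frame-to-frame tracking, and length-normalized displacements are assumed as the feature values of a trajectory; the patent specifies neither, and the feature-coding step that yields the virtual-video data is only stubbed out.

```python
# Hedged sketch of steps S1223-S1225 and S123; tracking method and descriptor are assumptions.
import cv2
import numpy as np

def track_trajectories(gray_frames, init_pts, track_len=15):
    """gray_frames: list of 8-bit grayscale images; init_pts: Nx2 float32 valid feature points."""
    trajs = [[(float(x), float(y))] for x, y in init_pts]
    prev = gray_frames[0]
    pts = init_pts.reshape(-1, 1, 2).astype(np.float32)
    for cur in gray_frames[1:track_len + 1]:
        nxt, status, _ = cv2.calcOpticalFlowPyrLK(prev, cur, pts, None)
        for traj, p, ok in zip(trajs, nxt.reshape(-1, 2), status.ravel()):
            if ok:                                   # record P_{t+1} only for tracked points
                traj.append((float(p[0]), float(p[1])))
        prev, pts = cur, nxt
    return trajs

def trajectory_feature(traj):
    disp = np.diff(np.asarray(traj, dtype=np.float32), axis=0)  # frame-to-frame displacements
    norm = np.linalg.norm(disp, axis=1).sum() + 1e-6
    return (disp / norm).ravel()                                 # one group of feature values

def encode_features(feature_groups):
    # Placeholder for the feature-coding step that yields the virtual-video data;
    # the patent does not specify the encoder. Groups from fully tracked
    # trajectories (equal length) are simply stacked into one matrix here.
    full = max(len(f) for f in feature_groups)
    return np.vstack([f for f in feature_groups if len(f) == full])
```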
Step S13: transmitting the live video and the virtual video to a cloud server through a first slice private network and a second slice private network, respectively.
In the embodiment of the present invention, optionally, both the first slice private network and the second slice private network are 5G slice private networks. So-called slicing is an on-demand networking mode that carves multiple virtual end-to-end networks out of a single infrastructure, each logically isolated from the access network through the bearer network to the core network. Transmitting the live video and the virtual video over different slice private networks fundamentally changes the transmission mode and increases the transmission rate. In addition, the live video and the virtual video are transmitted through a 5G base station and the core network.
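As a hedged illustration only, the sketch below assumes that each 5G slice private network is exposed to the electronic device as a separate Linux network interface (the interface names are hypothetical) and pushes the two encoded streams over UDP sockets bound to those interfaces; the patent does not specify the transport protocol or the operating-system interface to the slices.

```python
# Hedged sketch: one UDP socket per (hypothetical) slice interface.
# SO_BINDTODEVICE is Linux-only and normally requires elevated privileges.
import socket

def open_slice_socket(interface_name: str) -> socket.socket:
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.setsockopt(socket.SOL_SOCKET, socket.SO_BINDTODEVICE,
                    interface_name.encode() + b"\0")
    return sock

def send_over_slices(cloud_addr, live_packets, virtual_packets):
    live_sock = open_slice_socket("slice_live")       # first slice private network (assumed name)
    virt_sock = open_slice_socket("slice_virtual")    # second slice private network (assumed name)
    for pkt in live_packets:
        live_sock.sendto(pkt, cloud_addr)             # live video stream
    for pkt in virtual_packets:
        virt_sock.sendto(pkt, cloud_addr)             # virtual video stream
```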
As shown in fig. 2, the present invention provides another mixed reality live broadcasting method, which is applied to a server and includes steps S21 to S22.
Step S21: receiving a live video and a virtual video transmitted by an electronic device at an input end through a first slice private network and a second slice private network, respectively.
Step S22: transmitting the live video and the virtual video to an electronic device at an output end.
In the embodiment of the present invention, the server stores the data of the live video and the virtual video, such as the virtual data of the persons and articles at the input end, and even the character data of users at the output end. In addition, the server is responsible not only for distributing the live video and the virtual video but also for the live-viewing functions.
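A minimal relay sketch follows as an assumption about the server's internals, which the patent does not specify; the listening ports and the viewer address are hypothetical.

```python
# Hedged sketch of the cloud server's forwarding path; ports and viewer list are hypothetical.
import socket

VIEWERS = [("203.0.113.10", 7000)]        # output-end electronic devices (example address)

def relay(listen_port):
    """Forward every received packet of one stream (live or virtual) to all viewers."""
    recv_sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    recv_sock.bind(("0.0.0.0", listen_port))
    send_sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    while True:
        packet, _ = recv_sock.recvfrom(65535)
        for viewer in VIEWERS:
            send_sock.sendto(packet, viewer)

# relay(5000) could serve the live stream and relay(5001) the virtual stream,
# e.g. in two processes or threads.
```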
As shown in fig. 3, the present invention provides another mixed reality live broadcasting method applied to an electronic device, where the method includes steps S31 to S32.
Step S31: receiving a live video and a virtual video transmitted by a cloud server.
In the embodiment of the present invention, the live video and the virtual video may be received directly through a conventional network, or through a slice private network.
Step S32: displaying a live video picture or a virtual video picture according to the live video and the virtual video.
In the embodiment of the present invention, the virtual video picture includes at least one of: a character model in the live picture, an article model in the live picture, and a model of the user of the electronic device.
In some embodiments, optionally, step S32 specifically includes steps S321 to S322.
Step S321: receiving a switching input for the live video picture or the virtual video picture.
Step S322: switching between the live video picture and the virtual video picture in response to the switching input.
In the embodiment of the present invention, the user of the electronic device watches the live video through the live picture and performs virtual operations through the virtual picture, such as changing equipment in a product-selling live broadcast, which makes live interaction more entertaining.
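The switching behaviour on the output end can be sketched as follows, again as an assumption about the client implementation: both decoded streams are available as frame iterables, and a key press serves as the switching input.

```python
# Hedged sketch of steps S321-S322 on the output end; decoding is assumed done elsewhere.
import cv2

def play_with_switching(live_frames, virtual_frames, toggle_key=ord("v")):
    show_virtual = False
    for live, virtual in zip(live_frames, virtual_frames):
        frame = virtual if show_virtual else live
        cv2.imshow("mixed reality live", frame)
        key = cv2.waitKey(30) & 0xFF
        if key == toggle_key:         # switching input: toggle between the two pictures
            show_virtual = not show_virtual
        elif key == 27:               # Esc exits
            break
    cv2.destroyAllWindows()
```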
As shown in fig. 4, the present invention provides a mixed reality live broadcasting device, which includes a capture module 41, a rendering module 42, and a first transmission module 43.
The capture module 41 is configured to capture a live video of a live scene.
In the embodiment of the present invention, the live video is obtained by shooting the live scene with a camera, a mobile phone, or a similar device. The live video of the live scene includes the anchor and the background environment in which the anchor is located. The anchor may be a person or an article.
The rendering module 42 is configured to render the live video to obtain a virtual video.
In the embodiment of the present invention, the live video is rendered (for example, reconstructed in an image engine) to form virtual data of the persons and articles, and the virtual video is obtained from the virtual data.
The rendering module 42 analyzes each frame of live image of the live video, extracts the feature value of the motion trajectory of the object in the live image, and obtains the virtual video according to the feature value.
Specifically, the feature value of the motion trajectory of the object in the live image is extracted as follows.
Each frame of live image is divided into grids to collect feature points. Each frame of live image is sampled at multiple scales, typically 8 spatial scales; sampling over multiple spatial scales ensures that the sampled feature points cover all spatial positions and scales. At each scale, the feature points are densely collected by dividing the image into grids.
Invalid feature points are filtered out. Invalid feature points are feature points that lack variation and cannot be tracked; by computing the eigenvalues of the autocorrelation matrix of the pixels, feature points below a certain threshold can be filtered out.
The valid feature points of each frame are recorded. A valid feature point in frame t is denoted P_t = (x_t, y_t); in frame t+1, the corresponding valid feature point is P_{t+1} = (x_{t+1}, y_{t+1}).
The time-based motion trajectory of the object is obtained according to the valid feature points of each frame. The positions of a given feature point over a number of consecutive live images form a trajectory.
The feature value is extracted based on the motion trajectory. Subsequent feature points are extracted along the motion trajectory, and after a feature point is extracted, its feature value is calculated. A segment of live video contains a large number of motion trajectories, each corresponding to one group of feature values; the virtual video is obtained by performing feature coding on all the groups of feature values.
The first transmission module 43 is configured to transmit the live video and the virtual video to the cloud server through a first slice private network and a second slice private network, respectively.
In the embodiment of the present invention, both the first slice private network and the second slice private network are 5G slice private networks. So-called slicing is an on-demand networking mode that carves multiple virtual end-to-end networks out of a single infrastructure, each logically isolated from the access network through the bearer network to the core network. Transmitting the live video and the virtual video over different slice private networks fundamentally changes the transmission mode and increases the transmission rate. In addition, the live video and the virtual video are transmitted through a 5G base station and the core network.
As shown in fig. 5, the present invention provides another mixed reality live broadcasting apparatus, which includes a first receiving module 51 and a second transmitting module 52.
The first receiving module 51 is configured to receive a live video and a virtual video transmitted by an electronic device at an input end through a first slice private network and a second slice private network, respectively.
The second transmission module 52 is configured to transmit the live video and the virtual video to an electronic device at an output end.
In the embodiment of the present invention, the server stores the data of the live video and the virtual video, such as the virtual data of the persons and articles at the input end, and even the character data of users at the output end. In addition, the server is responsible not only for distributing the live video and the virtual video but also for the live-viewing functions.
As shown in fig. 6, the present invention provides another mixed reality live device, which includes a second receiving module 61 and a display module 62.
The second receiving module 61 is configured to receive live videos and virtual videos transmitted by the cloud server.
In the embodiment of the present invention, the live video and the virtual video may be received directly through a conventional network, or through a slice private network.
The display module 62 is configured to display a live video picture or a virtual video picture according to the live video and the virtual video.
In the embodiment of the present invention, the virtual video picture includes at least one of: a character model in the live picture, an article model in the live picture, and a model of the user of the electronic device.
The display module 62 is further configured to receive a switching input for the live video picture or the virtual video picture, and to switch between the live video picture and the virtual video picture in response to the switching input.
In the embodiment of the present invention, the user of the electronic device watches the live video through the live picture and performs virtual operations through the virtual picture, such as changing equipment in a product-selling live broadcast, which makes live interaction more entertaining.
Referring to fig. 7, an embodiment of the present invention further provides an electronic device 700, which may be a mobile phone, a tablet, a computer, a server, or another such device. As shown in fig. 7, the electronic device 700 includes a processor 701 and a memory 702, which are electrically connected.
The processor 701 is the control center of the electronic device 700. It connects the various parts of the electronic device with various interfaces and lines, and performs the functions of the electronic device and processes its data by running or loading the application programs stored in the memory 702 and calling the data stored in the memory 702, thereby monitoring the electronic device as a whole.
In this embodiment, the electronic device 700 is provided with a plurality of memory partitions, including a system partition and a target partition. The processor 701 loads instructions corresponding to the processes of one or more application programs into the memory 702 and runs the application programs stored in the memory 702, thereby executing the processes of the mixed reality live broadcast method embodiments shown in fig. 1, fig. 2, or fig. 3 and achieving the same technical effects.
It will be understood by those skilled in the art that all or part of the steps of the methods of the above embodiments may be performed by instructions or by instructions controlling associated hardware, and the instructions may be stored in a computer readable storage medium and loaded and executed by a processor. To this end, the present invention provides a readable storage medium, in which a plurality of instructions are stored, where the instructions can be loaded by a processor to execute the steps in any one of the mixed reality live broadcast methods provided by the embodiments of the present invention.
Wherein the readable storage medium may include: read Only Memory (ROM), Random Access Memory (RAM), magnetic or optical disks, and the like.
Since the instructions stored in the readable storage medium can execute the steps of any mixed reality live broadcast method provided by the embodiments of the present invention, they can achieve the beneficial effects achievable by any such method; for details, see the foregoing embodiments, which are not repeated here.
The beneficial effects of the invention are as follows: the different types of video in the live broadcast are transmitted through different slice private networks, which fundamentally changes the transmission mode and increases the transmission rate, while the mixed reality live broadcast mode makes live interaction more entertaining.
In the foregoing embodiments, the descriptions of the respective embodiments have respective emphasis, and for parts that are not described in detail in a certain embodiment, reference may be made to related descriptions of other embodiments.
The mixed reality live broadcast method, system, readable storage medium, and electronic device provided by the embodiments of the present invention are described in detail above. Specific examples are used herein to explain the principle and implementation of the invention, and the description of the embodiments is only intended to help understand its technical solution and core idea. Those of ordinary skill in the art will understand that the technical solutions described in the foregoing embodiments may still be modified, or some of their technical features may be equivalently replaced, and that such modifications or substitutions do not depart from the spirit and scope of the corresponding technical solutions of the embodiments of the present invention.

Claims (12)

1. A mixed reality live broadcast method, applied to an electronic device, comprising the following steps:
collecting a live video of a live scene;
rendering the live video to obtain a virtual video; and
transmitting the live video and the virtual video to a cloud server through a first slice private network and a second slice private network, respectively.
2. The mixed reality live broadcast method according to claim 1, wherein the step of rendering the live video to obtain the virtual video comprises:
analyzing each frame of live image of the live video;
extracting a feature value of a motion trajectory of an object in the live image; and
obtaining the virtual video according to the feature value.
3. The mixed reality live broadcast method according to claim 2, wherein the step of extracting the feature value of the motion trajectory of the object in the live image comprises:
dividing each frame of live image into grids to collect feature points;
filtering invalid feature points;
recording the valid feature points of each frame;
obtaining a time-based motion trajectory of the object according to the valid feature points of each frame; and
extracting the feature value based on the motion trajectory.
4. A mixed reality live broadcast method, applied to a server, comprising the following steps:
receiving a live video and a virtual video transmitted by an electronic device at an input end through a first slice private network and a second slice private network, respectively; and
transmitting the live video and the virtual video to an electronic device at an output end.
5. A mixed reality live broadcast method, applied to an electronic device, comprising the following steps:
receiving a live video and a virtual video transmitted by a cloud server; and
displaying a live video picture or a virtual video picture according to the live video and the virtual video.
6. The mixed reality live broadcast method according to claim 5, wherein the virtual video picture comprises at least one of: a character model in the live picture, an article model in the live picture, and a model of the user of the electronic device.
7. The mixed reality live broadcast method according to claim 5, further comprising:
receiving a switching input for the live video picture or the virtual video picture; and
switching between the live video picture and the virtual video picture in response to the switching input.
8. A mixed reality live broadcast device, applied to an electronic device, comprising:
a capture module, configured to collect a live video of a live scene;
a rendering module, configured to render the live video to obtain a virtual video; and
a first transmission module, configured to transmit the live video and the virtual video to a cloud server through a first slice private network and a second slice private network, respectively.
9. A mixed reality live broadcast device, applied to a server, comprising:
a first receiving module, configured to receive a live video and a virtual video transmitted by an electronic device at an input end through a first slice private network and a second slice private network, respectively; and
a second transmission module, configured to transmit the live video and the virtual video to an electronic device at an output end.
10. A mixed reality live broadcast device, applied to an electronic device, comprising:
a second receiving module, configured to receive a live video and a virtual video transmitted by a cloud server; and
a display module, configured to display a live video picture or a virtual video picture according to the live video and the virtual video.
11. An electronic device comprising a processor, a memory, and a program or instructions stored on the memory and executable on the processor, wherein the program or instructions, when executed by the processor, implement the steps of the mixed reality live broadcast method of any one of claims 1 to 3, or of claim 4, or of any one of claims 5 to 7.
12. A readable storage medium storing a program or instructions which, when executed by a processor, implement the steps of the mixed reality live broadcast method of any one of claims 1 to 3, or of claim 4, or of any one of claims 5 to 7.
CN202011231625.0A 2020-11-06 2020-11-06 Mixed reality live broadcast method, apparatus, electronic device and readable storage medium Active CN114466202B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202011231625.0A CN114466202B (en) 2020-11-06 2020-11-06 Mixed reality live broadcast method, apparatus, electronic device and readable storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202011231625.0A CN114466202B (en) 2020-11-06 2020-11-06 Mixed reality live broadcast method, apparatus, electronic device and readable storage medium

Publications (2)

Publication Number Publication Date
CN114466202A true CN114466202A (en) 2022-05-10
CN114466202B CN114466202B (en) 2023-12-12

Family

ID=81404717

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011231625.0A Active CN114466202B (en) 2020-11-06 2020-11-06 Mixed reality live broadcast method, apparatus, electronic device and readable storage medium

Country Status (1)

Country Link
CN (1) CN114466202B (en)

Patent Citations (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
GB201109590D0 (en) * 2011-06-08 2011-07-20 Cubicspace Ltd System for viewing and interacting with a virtual 3-D scene
CN106303555A (en) * 2016-08-05 2017-01-04 深圳市豆娱科技有限公司 A kind of live broadcasting method based on mixed reality, device and system
CN106231317A (en) * 2016-09-29 2016-12-14 三星电子(中国)研发中心 Video processing, coding/decoding method and device, VR terminal, audio/video player system
CN106993195A (en) * 2017-03-24 2017-07-28 广州创幻数码科技有限公司 Virtual portrait role live broadcasting method and system
WO2019017579A1 (en) * 2017-07-21 2019-01-24 삼성전자주식회사 Display device, display method and display system
CN107845129A (en) * 2017-11-07 2018-03-27 深圳狗尾草智能科技有限公司 Three-dimensional reconstruction method and device, the method and device of augmented reality
CN109963163A (en) * 2017-12-26 2019-07-02 阿里巴巴集团控股有限公司 Internet video live broadcasting method, device and electronic equipment
CN108629301A (en) * 2018-04-24 2018-10-09 重庆大学 A kind of human motion recognition method based on moving boundaries dense sampling and movement gradient histogram
CN111200859A (en) * 2018-11-19 2020-05-26 华为技术有限公司 Network slice selection method, network equipment and terminal
US20200289934A1 (en) * 2019-03-15 2020-09-17 Sony Interactive Entertainment Inc. Methods and systems for spectating characters in virtual reality views

Non-Patent Citations (3)

* Cited by examiner, † Cited by third party
Title
PETER GUSEV et al.: "Data-Centric Video for Mixed Reality", 2019 28th International Conference on Computer Communication and Networks (ICCCN) *
LIU Jie et al.: "Mobile VR Applications in 5G Networks" [5G网络中的移动VR应用], Telecommunications Science [电信科学], vol. 34, no. 10
HUANG Shansong et al.: "Research on a 5G SA Medical Private Network Solution for VR Visiting Services" [面向VR探视业务的5G SA医疗专网方案研究], Application of Electronic Technique [电子技术应用], no. 6

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115396688A (en) * 2022-10-31 2022-11-25 北京玩播互娱科技有限公司 Multi-person interactive network live broadcast method and system based on virtual scene
CN115396688B (en) * 2022-10-31 2022-12-27 北京玩播互娱科技有限公司 Multi-person interactive network live broadcast method and system based on virtual scene

Also Published As

Publication number Publication date
CN114466202B (en) 2023-12-12

Similar Documents

Publication Publication Date Title
CN107659825B (en) A kind of method, apparatus, server, main broadcaster end and medium that live video is retained
CN104519401B (en) Video segmentation point preparation method and equipment
CN108010037B (en) Image processing method, device and storage medium
CN105847718B (en) Live video barrage display methods based on scene Recognition and its display device
CN108632632B (en) Live webcast data processing method and device
CN108632676B (en) Image display method, image display device, storage medium and electronic device
CN110602554A (en) Cover image determining method, device and equipment
CN105871808A (en) Method and device for transcoding live video
CN105654471A (en) Augmented reality AR system applied to internet video live broadcast and method thereof
CN108635863B (en) Live webcast data processing method and device
CN104010179B (en) Multi-user clustering and viewpoint calculating system and method based on multiple three-dimensional pictures
CN106713942A (en) Video processing method and video processing device
CN110740290A (en) Monitoring video previewing method and device
CN113469200A (en) Data processing method and system, storage medium and computing device
CN111225287A (en) Bullet screen processing method and device, electronic equipment and storage medium
CN114466202B (en) Mixed reality live broadcast method, apparatus, electronic device and readable storage medium
CN204695223U (en) A kind of model display system based on augmented reality
CN113259764A (en) Video playing method, video playing device, electronic equipment and video playing system
CN112492231A (en) Remote interaction method, device, electronic equipment and computer readable storage medium
CN115396705A (en) Screen projection operation verification method, platform and system
CN105872537A (en) Video playing method, device and system
CN108881119A (en) A kind of methods, devices and systems of video concentration
CN105472271A (en) Video interaction method, device and system
CN116489424A (en) Live background generation method and device, electronic equipment and computer readable medium
CN113365130A (en) Live broadcast display method, live broadcast video acquisition method and related devices

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant