CN114554232A - Mixed reality live broadcast method and system based on naked eye 3D - Google Patents

Mixed reality live broadcast method and system based on naked eye 3D Download PDF

Info

Publication number
CN114554232A
CN114554232A (application CN202111677435.6A)
Authority
CN
China
Prior art keywords
live broadcast
video stream
stream
naked eye
format
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202111677435.6A
Other languages
Chinese (zh)
Other versions
CN114554232B (en)
Inventor
张伟香
周一柯
钱振超
许广巍
方勇
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shanghai Evis Technology Co ltd
Original Assignee
Shanghai Evis Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shanghai Evis Technology Co ltd filed Critical Shanghai Evis Technology Co ltd
Priority to CN202111677435.6A priority Critical patent/CN114554232B/en
Publication of CN114554232A publication Critical patent/CN114554232A/en
Application granted granted Critical
Publication of CN114554232B publication Critical patent/CN114554232B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • H — ELECTRICITY
    • H04 — ELECTRIC COMMUNICATION TECHNIQUE
    • H04N — PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 13/00 — Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N 13/302 — Image reproducers for viewing without the aid of special glasses, i.e. using autostereoscopic displays
    • H04N 21/00 — Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N 21/2187 — Live feed
    • H04N 21/2343 — Reformatting operations of video signals for distribution or compliance with end-user requests or end-user device requirements
    • H04N 21/234309 — Reformatting by transcoding between formats or standards, e.g. from MPEG-2 to MPEG-4
    • H04N 21/236 — Assembling of a multiplex stream, e.g. combining a video stream with other content or additional data

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Databases & Information Systems (AREA)
  • Testing, Inspecting, Measuring Of Stereoscopic Televisions And Televisions (AREA)

Abstract

The invention discloses a mixed reality live broadcast method and system based on naked eye 3D. The method comprises the following steps: step S1, the anchor push-stream end captures a video stream of the live broadcast site; step S2, the anchor push-stream end sends the captured video stream, together with the anchor's interaction commands, to the live broadcast server; step S3, the live broadcast server processes the video stream; step S4, the viewer pull-stream end obtains the video stream and the interaction commands from the live broadcast server; step S5, the viewer pull-stream end composites the video stream with content generated according to the interaction commands and outputs the result in a 3D format; step S6, the 3D-format content is displayed on a naked eye 3D display device. The method and system enable users to experience a 3D effect without wearing any auxiliary equipment, improving their sense of immersion when watching a live broadcast.

Description

Mixed reality live broadcast method and system based on naked eye 3D
Technical Field
The invention belongs to the technical field of naked eye 3D, relates to a live broadcast method, and particularly relates to a mixed reality live broadcast method and system based on naked eye 3D.
Background
Live streaming is one of the new media forms that has emerged in recent years and has become the most popular form of media presentation today. Through years of rapid development, live broadcast content has gradually expanded from professional fields such as news and conference broadcasts to entertainment and commercial fields such as dance streams, game streams, product launch events, and live e-commerce.
Live broadcasting is naturally suited to technologies such as AR and MR, which can empower it: they shorten the distance between the audience and the anchor and between the audience and virtual objects, let viewers genuinely feel face to face with the anchor, and let them closely observe virtual objects, greatly strengthening the audience's sense of immersion. However, owing to current technical limitations, AR and MR glasses still cause discomfort such as a sense of constraint and dizziness, which has greatly hindered the application and popularization of these technologies in the live broadcast industry.
Naked eye 3D technology, by contrast, can fundamentally resolve these shortcomings of AR and MR glasses: it eliminates the user's sense of constraint and dizziness and lets the user view a real 3D depth-of-field effect in real time without wearing any auxiliary equipment, qualitatively improving the user experience. From this point of view, naked eye 3D technology is the best technical partner for the live broadcast industry.
In view of the above, there is an urgent need to design a new live broadcast method that overcomes at least some of the above-mentioned disadvantages of existing live broadcast methods.
Disclosure of Invention
The invention provides a mixed reality live broadcast method and system based on naked eye 3D, which enable a user to experience a 3D effect without wearing any auxiliary equipment and improve the user's sense of immersion when watching a live broadcast.
In order to solve the technical problem, according to one aspect of the present invention, the following technical solutions are adopted:
A mixed reality live broadcast method based on naked eye 3D, the method comprising the following steps:
step S1, the anchor push-stream end captures a video stream of the live broadcast site;
step S2, the anchor push-stream end sends the captured video stream and the anchor's interaction commands to the live broadcast server;
step S3, the live broadcast server processes the video stream;
step S4, the viewer pull-stream end obtains the video stream and the interaction commands from the live broadcast server;
step S5, the viewer pull-stream end composites the video stream with content generated according to the interaction commands and outputs the result in a 3D format;
step S6, the 3D-format content is displayed on a naked eye 3D display device.
As an embodiment of the present invention, in step S1, the anchor push-stream end records the live broadcast site through a capture device.
As an embodiment of the present invention, in step S3, the live broadcast server's processing of the video stream includes at least one of bit rate conversion, format conversion, watermark addition, and CDN acceleration.
As an embodiment of the present invention, in step S6, the picture rendered in the 3D format at the viewer pull-stream end is output to a naked eye 3D display device for display, and the viewer can view the 3D effect without wearing any auxiliary device.
According to another aspect of the invention, the following technical scheme is adopted: a mixed reality live broadcast system based on naked eye 3D, the system comprising: a live broadcast server, at least one anchor push-stream end, at least one viewer pull-stream end, and at least one naked eye 3D display device;
the live broadcast server is connected to each anchor push-stream end and each viewer pull-stream end, and each viewer pull-stream end is connected to a corresponding naked eye 3D display device;
the anchor push-stream end is used to capture a video stream of the anchor's site and send the captured video stream and the anchor's interaction commands to the live broadcast server;
the live broadcast server is used to process the video stream;
the viewer pull-stream end is used to obtain the video stream and the interaction commands from the live broadcast server, composite the video stream with content generated according to the interaction commands, and output the result in a 3D format;
the naked eye 3D display device is used to display the corresponding content according to the obtained 3D-format content.
As an embodiment of the present invention, the anchor push-stream end records the live broadcast site through a video capture device.
As an embodiment of the present invention, the live broadcast server's processing of the video stream includes at least one of bit rate conversion, format conversion, watermark addition, and CDN acceleration.
As an embodiment of the present invention, the viewer pull-stream end outputs the picture rendered in the 3D format to a naked eye 3D display device; the naked eye 3D display device displays the corresponding content according to the received 3D-format content, and the viewer can view the 3D effect without wearing any auxiliary device.
The beneficial effects of the invention are as follows: the mixed reality live broadcast method and system based on naked eye 3D enable a user to experience a 3D effect without wearing any auxiliary equipment and improve the user's sense of immersion when watching a live broadcast.
Drawings
Fig. 1 is a flowchart of a mixed reality live broadcast method based on naked eye 3D in an embodiment of the present invention.
Fig. 2 is a schematic diagram of live video stream acquisition according to an embodiment of the present invention.
Fig. 3 is a schematic diagram of anchor-side stream pushing and viewer-side stream pulling according to an embodiment of the present invention.
Fig. 4 is a schematic diagram illustrating composition of a 2D video stream and authoring content according to an embodiment of the present invention.
Fig. 5 is a schematic diagram illustrating composition of a 3D video stream and authoring content according to an embodiment of the present invention.
Fig. 6 is a schematic diagram illustrating that the synthesized content is displayed on a naked-eye 3D display device in a 3D format according to an embodiment of the present invention.
Detailed Description
Preferred embodiments of the present invention will be described in detail below with reference to the accompanying drawings.
For a further understanding of the invention, reference will now be made to the preferred embodiments of the invention by way of example, and it is to be understood that the description is intended to further illustrate features and advantages of the invention, and not to limit the scope of the claims.
The description in this section covers only several exemplary embodiments, and the present invention is not limited to the scope of the described embodiments. Replacing some features of the embodiments with identical or similar prior art means also falls within the scope of disclosure and protection of the present invention.
The steps in the embodiments are numbered only for convenience of description, and the implementation of the present application is not limited by the order of the steps. The term "connected" in the specification includes both direct and indirect connection.
The invention discloses a mixed reality live broadcast method based on naked eye 3D. FIG. 1 is a flow chart of the method in one embodiment of the invention; referring to FIG. 1, the method includes:
step S1, the anchor push-stream end captures a video stream of the live broadcast site;
in an embodiment of the present invention, in step S1, the anchor push-stream end records the live broadcast site through a capture device;
step S2, the anchor push-stream end sends the captured video stream and the anchor's interaction commands to the live broadcast server;
step S3, the live broadcast server processes the video stream;
in an embodiment of the present invention, in step S3, the live broadcast server's processing of the video stream includes at least one of bit rate conversion, format conversion, watermark addition, and CDN acceleration;
step S4, the viewer pull-stream end obtains the video stream and the interaction commands from the live broadcast server;
step S5, the viewer pull-stream end composites the video stream with content generated according to the interaction commands and outputs the result in a 3D format;
step S6, the 3D-format content is displayed on a naked eye 3D display device.
In an embodiment of the present invention, in step S6, the picture rendered in the 3D format at the viewer pull-stream end is output to a naked eye 3D display device for display, and the viewer can view the 3D effect without wearing any auxiliary device.
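In rough pseudo-functional form, the six steps read as a chain from capture to display. The sketch below is purely illustrative — every function name and data shape is an assumption, not part of the patent:

```python
# Illustrative sketch of the S1-S6 pipeline; all names and shapes are invented.

def capture(site):                      # S1: anchor end captures the live site
    return {"frames": [f"frame-of-{site}"], "format": "2D"}

def push(stream, command):              # S2: push stream plus interaction command
    return {"stream": stream, "command": command}

def server_process(payload):            # S3: server-side processing (placeholder)
    payload["stream"]["processed"] = True
    return payload

def pull(server_payload):               # S4: viewer end pulls stream and command
    return server_payload["stream"], server_payload["command"]

def compose(stream, command):           # S5: composite stream with authored content
    overlay = ["gift-model"] if command == "send_gift" else []
    return {"stream": stream, "overlay": overlay, "format": "side-by-side-3D"}

def display(content):                   # S6: hand 3D-format frames to the display
    return f"{content['format']} with {len(content['overlay'])} overlay(s)"

payload = server_process(push(capture("studio"), "send_gift"))
print(display(compose(*pull(payload))))  # side-by-side-3D with 1 overlay(s)
```

The point of the sketch is only the data flow: the interaction command travels alongside the video stream and is not resolved into visible content until the viewer-side composition step.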
The invention further discloses a mixed reality live broadcast system based on naked eye 3D, comprising: a live broadcast server 1, at least one anchor push-stream end 2, at least one viewer pull-stream end 3, and at least one naked eye 3D display device 4.
The live broadcast server 1 is connected to each anchor push-stream end 2 and each viewer pull-stream end 3, and each viewer pull-stream end 3 is connected to a corresponding naked eye 3D display device 4.
The anchor push-stream end 2 is used to capture a video stream of the anchor's site and send the captured video stream together with the anchor's interaction commands to the live broadcast server.
In an embodiment of the present invention, the anchor push-stream end 2 records the live broadcast site through a video capture device (such as a monocular camera or a binocular camera).
In a usage scenario of the present invention, the anchor push-stream end captures a video stream of the live broadcast site; its capture device may be any common video recording device, including but not limited to monocular and binocular cameras. The capture device can deliver its signal in several ways: a network camera can send the captured video stream back over the network (including but not limited to HTTP, HTTPS, RTSP, RTMP, etc.); a camera can be connected locally, for example to the host computer through a USB interface, so that software can directly access its picture; or a camera with a video interface such as HDMI or SDI can be connected through a video capture card in the host computer.
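The three signal paths just described (network camera, locally connected USB camera, HDMI/SDI device behind a capture card) amount to a dispatch on the source type. A minimal illustrative sketch — the `usb:`, `hdmi:`, and `sdi:` source-string prefixes are conventions invented here, not part of the patent:

```python
from urllib.parse import urlparse

def classify_source(source: str) -> str:
    """Map a capture-source string onto one of the three paths in the text."""
    scheme = urlparse(source).scheme.lower()
    if scheme in ("rtsp", "rtmp", "http", "https"):
        return "network"          # frames come back over the network
    if source.startswith("usb:"):
        return "local-usb"        # direct USB camera, read via the OS driver
    if source.startswith(("hdmi:", "sdi:")):
        return "capture-card"     # HDMI/SDI signal through a capture card
    raise ValueError(f"unknown capture source: {source}")

print(classify_source("rtsp://cam.example/stream1"))  # network
print(classify_source("usb:0"))                       # local-usb
print(classify_source("sdi:1"))                       # capture-card
```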
The anchor push-stream end software sends the captured video stream and the anchor's interaction commands to the live broadcast server together: the content pushed to the live broadcast server comprises two parts. One part is the video stream of the live scene picture captured on site by the capture device; the other part is the interaction commands issued by the anchor through gestures, voice, or other means.
The live broadcast server 1 is used to process the video stream. In an embodiment of the present invention, this processing includes at least one of bit rate conversion, format conversion, watermark addition, and CDN acceleration.
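The server-side operations listed (bit rate conversion, format conversion, watermarking) can be pictured as a configurable chain of stream transformations, each applied only when the service requires it. A toy sketch with placeholder implementations — the field names and values are illustrative only:

```python
# Each operation is a function stream -> stream; the server applies whichever
# subset is configured. The dict fields stand in for real stream metadata.

def transcode_bitrate(stream):
    return dict(stream, bitrate_kbps=2500)   # placeholder rate conversion

def convert_format(stream):
    return dict(stream, container="flv")     # placeholder format conversion

def add_watermark(stream):
    return dict(stream, watermark="station-logo")

def process(stream, ops):
    for op in ops:                           # apply the configured chain in order
        stream = op(stream)
    return stream

out = process({"bitrate_kbps": 8000, "container": "mp4"},
              [transcode_bitrate, convert_format, add_watermark])
print(out)
```

CDN acceleration is omitted from the sketch since it is a distribution concern rather than a per-stream transformation.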
The viewer pull-stream end 3 is used to obtain the video stream and the interaction commands from the live broadcast server, composite the video stream with content generated according to the interaction commands, and output the composited result in a 3D format.
The viewer pull-stream end software composites the video stream with the content generated according to the interaction commands and outputs the result in a 3D format. The content generated according to the interaction commands refers to content produced by a content authoring tool, including but not limited to content (such as gift models and expression special effects) created with software such as Unity3D, Unreal, C4D, and Maya, which is then shown or hidden according to the interaction commands.
The 3D format includes, but is not limited to, left-right (side-by-side) 3D, top-bottom 3D, line-interleaved 3D, and the like. The 3D format is usually generated by adding a pair of virtual cameras to a scene in the development software to simulate the left and right human eyes, setting the distance between the two cameras to simulate the human interpupillary distance, and rendering the picture output by the two cameras into the 3D format.
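The three named output formats differ only in how the two per-eye pictures are packed into one frame. A toy sketch with frames as lists of pixel rows — real engines render textures, so this shows only the packing logic:

```python
# Pack two per-eye renders into each of the three 3D formats named above.
# A frame is a list of pixel rows; pixels here are just "L"/"R" markers.

def side_by_side(left, right):          # left-right 3D: halves share each row
    return [l + r for l, r in zip(left, right)]

def top_bottom(left, right):            # top-bottom 3D: left stacked above right
    return left + right

def line_interleaved(left, right):      # alternating rows: L, R, L, R, ...
    rows = []
    for l, r in zip(left, right):
        rows += [l, r]
    return rows

left  = [["L"] * 2 for _ in range(2)]   # 2x2 left-eye frame
right = [["R"] * 2 for _ in range(2)]   # 2x2 right-eye frame
print(side_by_side(left, right)[0])     # a row is left pixels then right pixels
```

The naked eye 3D display then unpacks whichever format it receives and routes each half to the matching viewing zones.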
The naked eye 3D display device 4 is used to display the corresponding content according to the obtained 3D-format content. In an embodiment of the present invention, the viewer pull-stream end 3 outputs the picture rendered in the 3D format to the naked eye 3D display device 4; the device displays the corresponding content according to the received 3D-format content, and the viewer can view the 3D effect without wearing any auxiliary device.
The 3D-format content is displayed on the naked eye 3D display device: the composited content is shown on the device in naked eye 3D, while remaining compatible with 2D display. The viewer can see the 3D effect without wearing any auxiliary device.
Fig. 2 is a schematic view of capturing the video stream of the live broadcast site. The site can be captured and recorded in real time by a video capture device, which may be a monocular camera for 2D recording or a binocular camera for 3D recording. The recorded video stream is input in real time to a host computer, which runs the required software.
Live capture can be done in two ways. In one, the live broadcast site uses a green screen background, which makes it convenient to later apply matting (the chroma-key technique used in photography and film shooting) so that only the subject content (mainly the anchor) is retained. In the other, no green screen is needed: the site is simply shot as it is, and its actual background is retained (as shown in fig. 2).
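The green screen path relies on chroma-key matting. A toy per-pixel sketch of the idea — real pipelines run on GPU textures with soft edge handling, and the key colour and tolerance chosen here are arbitrary:

```python
# Toy chroma key: pixels close to the key colour become fully transparent,
# so only the subject (e.g. the anchor) survives composition.

def chroma_key(pixels, key=(0, 255, 0), tol=60):
    out = []
    for r, g, b in pixels:
        dist = abs(r - key[0]) + abs(g - key[1]) + abs(b - key[2])
        alpha = 0 if dist < tol else 255   # transparent where "green enough"
        out.append((r, g, b, alpha))
    return out

frame = [(10, 250, 12), (200, 30, 40)]    # one green-screen pixel, one subject pixel
print(chroma_key(frame))
```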
As shown in fig. 3, the application software running on the anchor-side host pushes the captured video stream (2D or 3D) and the anchor's interaction commands to the live broadcast server; the transmission protocol used for pushing may be one commonly used in the industry, such as RTSP, RTMP, or HTTP (a proprietary protocol may also be developed). The live broadcast server may perform some video and image processing on the stream according to actual service requirements, such as bit rate conversion, frame rate conversion, and watermark addition. The viewer software (i.e. client software) connects to the live broadcast server over the network and obtains the video stream and the anchor-side interaction commands; it may likewise use RTSP, RTMP, HTTP, or other common protocols (or a proprietary one).
As shown in figs. 4 and 5, the software running on the viewer-side (client) host composites the captured and recorded video stream with the authored content. The authored content is a pre-modeled 3D asset, such as an interactive expression or a gift sent in the live broadcast, which is shown or hidden according to the anchor-side interaction command. The 3D parallax can be optimized and adjusted during composition to ensure the 3D effect of the overall content. After composition, the output can be rendered in a 3D format, including but not limited to left-right 3D, top-bottom 3D, line-interleaved 3D, etc. (left-right 3D is taken as the example here).
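The show-or-hide composition driven by interaction commands can be pictured as layering authored assets over the pulled video frame. A minimal illustrative sketch; the command fields and asset names are invented here, not taken from the patent:

```python
# Authored assets (made in a tool such as Unity3D or Maya) keyed by a
# hypothetical asset id; the command decides which, if any, is overlaid.
AUTHORED = {"gift": "gift-model", "emote": "emote-effect"}

def compose_frame(video_frame, command):
    """Return the layer stack for one output frame: backdrop plus overlays."""
    layers = [video_frame]                       # pulled video is the backdrop
    if command and command.get("show") in AUTHORED:
        layers.append(AUTHORED[command["show"]])  # show the requested asset
    return layers

print(compose_frame("frame-001", {"show": "gift"}))  # backdrop + gift model
print(compose_frame("frame-002", None))              # backdrop only
```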
The implementation details of content composition differ according to the development software used, but the basic principle is the same. As shown in fig. 4, development based on the common Unity3D is described here as an example. During application development, the captured video stream is treated in the Unity3D scene as a texture map and assigned to a Unity object component as the form of video playback. Meanwhile, according to actual service needs, 3D models (compatible with 2D maps) such as interactive expressions and gifts, or any other components the service requires, are added to the scene. Finally, two cameras are placed in the scene to simulate the left and right eyes respectively, with the distance between them simulating the human interpupillary distance; the two scene cameras shoot the left and right pictures respectively, which are finally rendered into a left-right 3D format for display.
Content composition can be compatible with both 2D and 3D modes. As shown in fig. 4, when the anchor end uses a 2D capture device and obtains a 2D video stream, only one Unity object component in the scene is used to display the video stream during composition. As shown in fig. 5, when the anchor end uses a 3D capture device and obtains a 3D video stream (left-right, top-bottom, line-interleaved, etc.), the 3D video stream must first be split into left and right video streams; two fully overlapping Unity object components in the scene then display the split left and right streams respectively, with the component displaying the left stream set to be visible only to the left scene camera and the component displaying the right stream visible only to the right scene camera.
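The 3D-source path begins by cutting the packed frame into per-eye halves, each of which is then routed only to its eye's scene camera (the per-eye visibility the Unity layer setup achieves). The splitting step alone can be sketched as follows, with frames as lists of pixel rows, illustrative only:

```python
# Split a left-right (side-by-side) packed frame into per-eye frames.
# Each half would be shown on a component visible only to one scene camera.

def split_side_by_side(frame):
    half = len(frame[0]) // 2
    left  = [row[:half] for row in frame]   # feeds the left scene camera
    right = [row[half:] for row in frame]   # feeds the right scene camera
    return left, right

sbs = [["L", "L", "R", "R"], ["L", "L", "R", "R"]]
left, right = split_side_by_side(sbs)
print(left)    # the left-eye half of every row
print(right)   # the right-eye half of every row
```

Top-bottom and line-interleaved sources would split on rows instead of columns, but the routing to per-eye cameras is the same.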
As shown in fig. 6, the composited content is output to the naked eye 3D display device for real-time display. The device is responsible for applying its algorithm to the input 3D-format content and displaying it in naked eye 3D in real time; within the device's viewing range, a user can experience the 3D effect without wearing any auxiliary equipment and thus enjoys a strong sense of immersion.
In conclusion, the mixed reality live broadcast method and system based on naked eye 3D provided by the invention enable users to experience a 3D effect without wearing any auxiliary equipment and improve their sense of immersion when watching a live broadcast.
It should be noted that the present application may be implemented in software and/or a combination of software and hardware; for example, it may be implemented using Application Specific Integrated Circuits (ASICs), general purpose computers, or any other similar hardware devices. In some embodiments, the software programs of the present application may be executed by a processor to implement the above steps or functions. As such, the software programs (including associated data structures) of the present application can be stored in a computer-readable recording medium; such as RAM memory, magnetic or optical drives or diskettes, and the like. In addition, some steps or functions of the present application may be implemented using hardware; for example, as circuitry that cooperates with the processor to perform various steps or functions.
The technical features of the embodiments described above may be arbitrarily combined, and for the sake of brevity, all possible combinations of the technical features in the embodiments described above are not described, but should be considered as being within the scope of the present specification as long as there is no contradiction between the combinations of the technical features.
The description and applications of the invention herein are illustrative and are not intended to limit the scope of the invention to the embodiments described above. Effects or advantages referred to in the embodiments may not be reflected in the embodiments due to interference of various factors, and the description of the effects or advantages is not intended to limit the embodiments. Variations and modifications of the embodiments disclosed herein are possible, and alternative and equivalent various components of the embodiments will be apparent to those skilled in the art. It will be clear to those skilled in the art that the present invention may be embodied in other forms, structures, arrangements, proportions, and with other components, materials, and parts, without departing from the spirit or essential characteristics thereof. Other variations and modifications of the embodiments disclosed herein may be made without departing from the scope and spirit of the invention.

Claims (8)

1. A mixed reality live broadcast method based on naked eye 3D, characterized by comprising the following steps:
step S1, the anchor push-stream end captures a video stream of the live broadcast site;
step S2, the anchor push-stream end sends the captured video stream and the anchor's interaction commands to the live broadcast server;
step S3, the live broadcast server processes the video stream;
step S4, the viewer pull-stream end obtains the video stream and the interaction commands from the live broadcast server;
step S5, the viewer pull-stream end composites the video stream with content generated according to the interaction commands and outputs the result in a 3D format;
step S6, the 3D-format content is displayed on a naked eye 3D display device.
2. The naked eye 3D-based mixed reality live broadcasting method according to claim 1, characterized in that:
in step S1, the anchor push-stream end records the live broadcast site through a capture device.
3. The naked eye 3D-based mixed reality live broadcasting method according to claim 1, characterized in that:
in step S3, the live broadcast server's processing of the video stream includes at least one of bit rate conversion, format conversion, watermark addition, and CDN acceleration.
4. The naked eye 3D-based mixed reality live broadcasting method according to claim 1, characterized in that:
in step S6, the picture rendered in the 3D format at the viewer pull-stream end is output to a naked eye 3D display device for display, and the viewer can view the 3D effect without wearing any auxiliary device.
5. A mixed reality live broadcast system based on naked eye 3D, characterized in that the system comprises: a live broadcast server, at least one anchor push-stream end, at least one viewer pull-stream end, and at least one naked eye 3D display device;
the live broadcast server is connected to each anchor push-stream end and each viewer pull-stream end, and each viewer pull-stream end is connected to a corresponding naked eye 3D display device;
the anchor push-stream end is used to capture a video stream of the anchor's site and send the captured video stream and the anchor's interaction commands to the live broadcast server;
the live broadcast server is used to process the video stream;
the viewer pull-stream end is used to obtain the video stream and the interaction commands from the live broadcast server, composite the video stream with content generated according to the interaction commands, and output the result in a 3D format;
the naked eye 3D display device is used to display the corresponding content according to the obtained 3D-format content.
6. The naked eye 3D-based mixed reality live system according to claim 5, wherein:
the anchor stream pushing end records and collects the live broadcast site through a video collection device.
7. The naked eye 3D-based mixed reality live system according to claim 5, wherein:
the processing of the video stream by the live broadcast server comprises at least one of bit rate conversion, format conversion, watermark addition and CDN acceleration.
8. The naked eye 3D-based mixed reality live broadcasting system according to claim 5, wherein:
the audience stream pulling end outputs the picture rendered in 3D format to the naked eye 3D display device; the naked eye 3D display device displays corresponding content according to the received 3D format content, and the viewer can watch the 3D effect without wearing any auxiliary device.
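One common way a "picture rendered in a 3D format" is prepared for a lenticular or parallax-barrier panel is column interleaving of the left- and right-eye views. The two-view sketch below is a general illustration, not the patent's specific method; real panels differ in view count and subpixel mapping.

```python
# Illustrative two-view column interleaving for a naked eye 3D panel.
# Real displays vary (view count, subpixel layout); this is a toy example.

def interleave_views(left, right):
    """Take even pixel columns from the left view, odd columns from the right."""
    assert len(left) == len(right) and len(left[0]) == len(right[0])
    return [
        [lrow[x] if x % 2 == 0 else rrow[x] for x in range(len(lrow))]
        for lrow, rrow in zip(left, right)
    ]

left = [["L0", "L1", "L2", "L3"]]   # one row of the left-eye frame
right = [["R0", "R1", "R2", "R3"]]  # matching row of the right-eye frame
frame = interleave_views(left, right)
# frame == [["L0", "R1", "L2", "R3"]]
```

The panel's lens array then directs the even columns to one eye and the odd columns to the other, which is why no glasses are needed.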
CN202111677435.6A 2021-12-31 2021-12-31 Naked eye 3D-based mixed reality live broadcast method and system Active CN114554232B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202111677435.6A CN114554232B (en) 2021-12-31 2021-12-31 Naked eye 3D-based mixed reality live broadcast method and system

Publications (2)

Publication Number Publication Date
CN114554232A true CN114554232A (en) 2022-05-27
CN114554232B CN114554232B (en) 2024-05-31

Family

ID=81669268

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202111677435.6A Active CN114554232B (en) 2021-12-31 2021-12-31 Naked eye 3D-based mixed reality live broadcast method and system

Country Status (1)

Country Link
CN (1) CN114554232B (en)

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107404644A (en) * 2017-07-27 2017-11-28 深圳依偎控股有限公司 Live 3D display method and system based on dual-camera capture
CN108900928A (en) * 2018-07-26 2018-11-27 宁波视睿迪光电有限公司 Naked eye 3D live streaming method and device, 3D screen client, and streaming media cloud server
CN110519611A (en) * 2019-08-23 2019-11-29 腾讯科技(深圳)有限公司 Live broadcast interaction method and apparatus, electronic device, and storage medium
CN111683260A (en) * 2020-05-07 2020-09-18 广东康云科技有限公司 Program video generation method, system and storage medium based on virtual anchor

Also Published As

Publication number Publication date
CN114554232B (en) 2024-05-31

Similar Documents

Publication Publication Date Title
CN106789991B (en) Multi-person interactive network live broadcast method and system based on virtual scene
JP7135141B2 (en) Information processing system, information processing method, and information processing program
CN106792246B (en) Method and system for interaction of fusion type virtual scene
JP3789794B2 (en) Stereoscopic image processing method, apparatus, and system
WO2021135334A1 (en) Method and apparatus for processing live streaming content, and system
CN105939481A (en) Recorded and live broadcasting method for interactive three-dimensional virtual reality video programs
CN106792228A (en) Live broadcast interaction method and system
KR101791778B1 (en) Method of Service for Providing Advertisement Contents to Game Play Video
CN106303289A (en) Method, apparatus and system for fused display of a real object and a virtual scene
CN113099204A (en) Remote live-action augmented reality method based on VR head-mounted display equipment
KR20150105058A (en) Mixed reality type virtual performance system using online
Zerman et al. User behaviour analysis of volumetric video in augmented reality
CN114286021B (en) Rendering method, rendering device, server, storage medium, and program product
KR100901111B1 (en) Live-Image Providing System Using Contents of 3D Virtual Space
CN109286760B (en) Entertainment video production method and terminal thereof
CN110730340A (en) Lens transformation-based virtual auditorium display method, system and storage medium
KR101752691B1 (en) Apparatus and method for providing virtual 3d contents animation where view selection is possible
KR101430985B1 (en) System and Method on Providing Multi-Dimensional Content
EP2590419A2 (en) Multi-depth adaptation for video content
CN117041608A (en) Data processing method and storage medium for linking on-line exhibition and off-line exhibition
CN109872400B (en) Panoramic virtual reality scene generation method
CN114554232B (en) Naked eye 3D-based mixed reality live broadcast method and system
JP6091850B2 (en) Telecommunications apparatus and telecommunications method
CN113259544B (en) Remote interactive holographic demonstration system and method
JP2020101847A (en) Image file generator, method for generating image file, image generator, method for generating image, image generation system, and program

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant