CN116260988A - Remote autonomous VR (virtual reality) exhibition watching robot system based on VR technology and motion control - Google Patents

Remote autonomous VR (virtual reality) exhibition watching robot system based on VR technology and motion control

Info

Publication number
CN116260988A
CN116260988A (application number CN202310107281.XA)
Authority
CN
China
Prior art keywords
module
robot
remote
viewing
subsystem
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202310107281.XA
Other languages
Chinese (zh)
Inventor
谭强
马辰
姜荣
闫盼盼
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shandong New Generation Information Industry Technology Research Institute Co Ltd
Original Assignee
Shandong New Generation Information Industry Technology Research Institute Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shandong New Generation Information Industry Technology Research Institute Co Ltd filed Critical Shandong New Generation Information Industry Technology Research Institute Co Ltd
Priority to CN202310107281.XA
Publication of CN116260988A
Legal status: Pending


Classifications

    • BPERFORMING OPERATIONS; TRANSPORTING
    • B23MACHINE TOOLS; METAL-WORKING NOT OTHERWISE PROVIDED FOR
    • B23KSOLDERING OR UNSOLDERING; WELDING; CLADDING OR PLATING BY SOLDERING OR WELDING; CUTTING BY APPLYING HEAT LOCALLY, e.g. FLAME CUTTING; WORKING BY LASER BEAM
    • B23K9/00Arc welding or cutting
    • B23K9/08Arrangements or circuits for magnetic control of the arc
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B25HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25JMANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00Programme-controlled manipulators
    • B25J9/16Programme controls
    • B25J9/1679Programme controls characterised by the tasks executed
    • B25J9/1689Teleoperation
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F3/012Head tracking input arrangements
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/20Servers specifically adapted for the distribution of content, e.g. VOD servers; Operations thereof
    • H04N21/21Server components or server architectures
    • H04N21/218Source of audio or video content, e.g. local disk arrays
    • H04N21/2187Live feed
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/20Servers specifically adapted for the distribution of content, e.g. VOD servers; Operations thereof
    • H04N21/23Processing of content or additional data; Elementary server operations; Server middleware
    • H04N21/234Processing of video elementary streams, e.g. splicing of video streams, manipulating MPEG-4 scene graphs
    • H04N21/2343Processing of video elementary streams, e.g. splicing of video streams, manipulating MPEG-4 scene graphs involving reformatting operations of video signals for distribution or compliance with end-user requests or end-user device requirements
    • H04N21/234363Processing of video elementary streams, e.g. splicing of video streams, manipulating MPEG-4 scene graphs involving reformatting operations of video signals for distribution or compliance with end-user requests or end-user device requirements by altering the spatial resolution, e.g. for clients with a lower screen resolution
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/20Servers specifically adapted for the distribution of content, e.g. VOD servers; Operations thereof
    • H04N21/23Processing of content or additional data; Elementary server operations; Server middleware
    • H04N21/234Processing of video elementary streams, e.g. splicing of video streams, manipulating MPEG-4 scene graphs
    • H04N21/2343Processing of video elementary streams, e.g. splicing of video streams, manipulating MPEG-4 scene graphs involving reformatting operations of video signals for distribution or compliance with end-user requests or end-user device requirements
    • H04N21/234381Processing of video elementary streams, e.g. splicing of video streams, manipulating MPEG-4 scene graphs involving reformatting operations of video signals for distribution or compliance with end-user requests or end-user device requirements by altering the temporal resolution, e.g. decreasing the frame rate by frame skipping
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/20Servers specifically adapted for the distribution of content, e.g. VOD servers; Operations thereof
    • H04N21/25Management operations performed by the server for facilitating the content distribution or administrating data related to end-users or client devices, e.g. end-user or client device authentication, learning user preferences for recommending movies
    • H04N21/258Client or end-user data management, e.g. managing client capabilities, user preferences or demographics, processing of multiple end-users preferences to derive collaborative data
    • H04N21/25866Management of end-user data
    • H04N21/25875Management of end-user data involving end-user authentication
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/20Servers specifically adapted for the distribution of content, e.g. VOD servers; Operations thereof
    • H04N21/25Management operations performed by the server for facilitating the content distribution or administrating data related to end-users or client devices, e.g. end-user or client device authentication, learning user preferences for recommending movies
    • H04N21/266Channel or content management, e.g. generation and management of keys and entitlement messages in a conditional access system, merging a VOD unicast channel into a multicast channel
    • H04N21/2662Controlling the complexity of the video stream, e.g. by scaling the resolution or bitrate of the video stream based on the client capabilities
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N5/00Details of television systems
    • H04N5/222Studio circuitry; Studio devices; Studio equipment
    • H04N5/262Studio circuits, e.g. for mixing, switching-over, change of character of image, other special effects ; Cameras specially adapted for the electronic generation of special effects
    • H04N5/265Mixing
    • YGENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02PCLIMATE CHANGE MITIGATION TECHNOLOGIES IN THE PRODUCTION OR PROCESSING OF GOODS
    • Y02P90/00Enabling technologies with a potential contribution to greenhouse gas [GHG] emissions mitigation
    • Y02P90/02Total factory control, e.g. smart factories, flexible manufacturing systems [FMS] or integrated manufacturing systems [IMS]

Abstract

A remote autonomous VR (virtual reality) exhibition-viewing robot system based on VR technology and motion control. A remote viewer autonomously controls the movement of a robot from afar, forming a physically separated VR remote presence. This saves travel cost and travel time while still providing real-time live-scene participation, an immersive atmosphere and feeling, and efficient real-time on-site communication; the viewer independently controls the pace of the live broadcast, achieving low-cost, multi-angle, high-freedom VR live streaming.

Description

Remote autonomous VR (virtual reality) exhibition watching robot system based on VR technology and motion control
Technical Field
The invention relates to the technical field of robots, in particular to a remote autonomous VR viewing robot system based on VR technology and motion control.
Background
The development of VR live streaming already lets remote viewers obtain an immersive viewing experience, but current VR live streaming is mainly fixed-position streaming, which restricts the available viewpoints; deploying multiple VR cameras across an entire venue to achieve multi-position streaming would greatly increase cost. The prior art does not combine VR technology with robotics. If a robot is used as the carrier for a VR camera, then, combined with the robot's remote control and remote voice interaction capabilities, a VR viewing robot with remote live-scene presentation and real-time interaction capability can be realized.
Disclosure of Invention
To overcome the shortcomings of the prior art, the invention provides a system that allows an exhibition visitor to obtain an immersive viewing experience through terminals such as a VR head-mounted display without travelling to the exhibition site.
The technical solution adopted to overcome the above technical problems is as follows:
a remote autonomous VR viewing robot system based on VR technology and motion control, comprising:
the system comprises a robot subsystem, a VR live-broadcast subsystem and a remote control subsystem arranged at an exhibition site;
the robot in the robot subsystem is equipped with a VR camera, a SLAM unit and a communication unit; using the SLAM unit, the robot scans the exhibition site and builds a SLAM-based map of it, exhibit positions are marked on the map according to the site layout, and the robot performs autonomous navigation and autonomous obstacle avoidance according to the map and the marked points;
a cloud server is network-connected to the communication unit in the robot subsystem;
a remote viewing end consists of a VR head-mounted display connected to the cloud server through a network;
the VR live-broadcast subsystem consists of a video acquisition module arranged on the robot, a VR video processing module arranged on the cloud server and a VR video viewing module arranged on the VR head-mounted display; the video acquisition module stitches the footage captured by the VR camera into a panoramic video, the acquired panoramic video is transmitted over the network to the VR video processing module, the VR video processing module transcodes the panoramic video and distributes it through a CDN, the transcoded panoramic video is transmitted to the VR video viewing module, and the user watches it through the VR head-mounted display;
the remote viewing end remotely controls the robot through the remote control subsystem.
Further, the communication unit is a 5G CPE gateway, and the robot is connected with the cloud server through a 5G network.
Further, the transcoding performed by the VR video processing module includes resolution transcoding, format transcoding and streaming-media encapsulation transcoding.
Further, the system comprises a voice interaction subsystem consisting of an audio acquisition and playback module and a voice intercom service module; the voice intercom service module is deployed on the cloud server, and the audio acquisition and playback module, consisting of a microphone and a loudspeaker, is arranged on the robot, the VR head-mounted display, the PC and the mobile phone.
Further, the system comprises a background management subsystem consisting of a login authentication module, an autonomous reservation module, a viewing parameter setting module, a live-broadcast control module, a bill and balance management module, and a viewing-entrance generation and sharing module. The login authentication module performs login identity verification; the autonomous reservation module selects the intended exhibition and viewing time; the live-broadcast control module selects whether live-broadcast recording is enabled; the bill and balance management module handles recharging, fee deduction and bill viewing; and the viewing-entrance generation and sharing module sets live-broadcast viewing passwords and generates and shares live-broadcast entrance links.
Further, the remote viewing end may also comprise a PC or a mobile phone.
Further, the remote control subsystem consists of a remote control interface module arranged at the remote viewing end, a message communication module arranged on the cloud server, and a robot motion response module arranged on the robot. The remote control interface module transmits control information to the cloud server through an integrated control SDK or by calling a control API; the message communication module encrypts the control messages over the full link; the cloud server transmits the encrypted control information to the motion response module, and the motion response module drives the robot to execute the remote-control actions.
The beneficial effects of the invention are as follows: the remote viewer autonomously controls the movement of the robot from afar, forming a physically separated VR remote presence. This saves travel cost and travel time while still providing real-time live-scene participation, an immersive atmosphere and feeling, and efficient real-time on-site communication; the viewer independently controls the pace of the live broadcast, achieving low-cost, multi-angle, high-freedom VR live streaming.
Drawings
FIG. 1 is a system block diagram of the present invention;
fig. 2 is a block diagram of a VR live subsystem of the present invention;
fig. 3 is a block diagram of a remote control subsystem of the present invention.
Detailed Description
The invention will be further described with reference to fig. 1, 2 and 3.
A remote autonomous VR viewing robot system based on VR technology and motion control comprises a robot subsystem, a VR live-broadcast subsystem and a remote control subsystem arranged at an exhibition site. The robot in the robot subsystem is equipped with a VR camera, a SLAM unit and a communication unit; using the SLAM unit, the robot scans the exhibition site and builds a SLAM-based map of it, exhibit positions are marked on the map according to the site layout, and the robot performs autonomous navigation and autonomous obstacle avoidance according to the map and the marked points. A cloud server is network-connected to the communication unit in the robot subsystem. The remote viewing end consists of a VR head-mounted display connected to the cloud server through a network. The VR live-broadcast subsystem consists of a video acquisition module arranged on the robot, a VR video processing module arranged on the cloud server and a VR video viewing module arranged on the VR head-mounted display; the video acquisition module stitches the footage captured by the VR camera to complete panoramic video acquisition.
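As a minimal sketch of how the marked exhibit points could drive autonomous navigation, the fragment below builds a toy occupancy grid standing in for the SLAM map, marks two hypothetical booth positions, and plans an obstacle-avoiding route with breadth-first search. The grid values, booth names and the planner itself are illustrative assumptions, not taken from the patent:

```python
from collections import deque

# Hypothetical occupancy grid produced by the SLAM unit: 0 = free, 1 = obstacle.
GRID = [
    [0, 0, 0, 1, 0],
    [1, 1, 0, 1, 0],
    [0, 0, 0, 0, 0],
    [0, 1, 1, 1, 0],
    [0, 0, 0, 0, 0],
]

# Exhibit positions marked on the map after scanning (names are illustrative).
BOOTHS = {"booth_a": (0, 4), "booth_b": (4, 0)}

def plan_path(grid, start, goal):
    """Breadth-first search over free cells; returns a list of cells or None."""
    rows, cols = len(grid), len(grid[0])
    prev = {start: None}          # also serves as the visited set
    queue = deque([start])
    while queue:
        cell = queue.popleft()
        if cell == goal:          # reconstruct the route back to the start
            path = []
            while cell is not None:
                path.append(cell)
                cell = prev[cell]
            return path[::-1]
        r, c = cell
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if (0 <= nr < rows and 0 <= nc < cols
                    and grid[nr][nc] == 0 and (nr, nc) not in prev):
                prev[(nr, nc)] = cell
                queue.append((nr, nc))
    return None  # goal unreachable

def navigate_to_booth(name, start=(0, 0)):
    """Resolve a marked booth to map coordinates and plan a route to it."""
    return plan_path(GRID, start, BOOTHS[name])
```

A real system would use a continuous costmap and a planner such as A* over the SLAM output, but the marked-point-to-route flow is the same.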
A VR panoramic camera has several independent lenses that synchronously produce multiple separate videos, so the streams must be stitched and fused into one complete panoramic video. Stitching can be performed at the acquisition end or in the cloud. With cloud stitching, the acquisition end encodes and encapsulates each raw stream captured by the VR camera and transmits them directly to the cloud platform over the 5G network; the cloud platform de-encapsulates and decodes each injected stream, restores the original videos, stitches them, and encodes and encapsulates the result into one complete video stream, which then undergoes transcoding, scheduling and distribution via efficient video coding, compression and stream pushing. The collected panoramic video is transmitted over the network to the VR video processing module, which transcodes it and distributes it through a CDN; the transcoded panoramic video is delivered to the VR video viewing module and watched through the VR head-mounted display. The remote viewing end remotely controls the robot through the remote control subsystem, for example (but not limited to) moving forward, moving backward, rotating the chassis left or right, and raising or lowering the lifting rod.
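The cloud-side path just described (decode each lens stream, stitch, re-encode) can be modeled as a chain of stages. The sketch below uses placeholder strings in place of real decoded frames; the function names and the "hevc" codec label are illustrative assumptions, not a real codec integration:

```python
# Illustrative model of the cloud-side stitching pipeline; "frames" are
# stand-ins for decoded video data, not output from a real decoder.

def decode(stream):
    """De-encapsulate and decode one injected lens stream (placeholder)."""
    return stream["payload"]

def stitch(frames):
    """Fuse the per-lens views into one panoramic frame (placeholder join;
    real stitching warps and blends overlapping fields of view)."""
    return "|".join(frames)

def encode(panorama, codec="hevc"):
    """Re-encode and encapsulate the stitched panorama for transcoding
    and CDN distribution."""
    return {"codec": codec, "payload": panorama}

def cloud_stitch(streams):
    """Full cloud path: decode each lens stream, stitch, re-encode."""
    return encode(stitch([decode(s) for s in streams]))

# A four-lens panoramic camera produces four synchronized streams.
lens_streams = [{"payload": f"lens{i}"} for i in range(4)]
```

Each stage maps onto a step named in the text, so swapping a placeholder for a real decoder or stitcher does not change the overall flow.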
At the exhibition site, a robot with autonomous navigation, autonomous obstacle avoidance and remote control capability serves as the chassis carrier. The robot carries a lifting rod as its main supporting structure, with a VR camera mounted at the top of the rod; it is also fitted with an external loudspeaker and a microphone, and achieves 5G network communication through an on-board 5G router. The cloud side deploys the related services on cloud infrastructure. The remote viewing end uses VR head-mounted displays, PCs, mobile phones and the like to watch and interact with the VR video. The viewing function of the VR live-broadcast subsystem, the remote-control buttons of the remote control subsystem, and the real-time voice intercom of the voice interaction subsystem are integrated into the live-broadcast viewing terminal software.
The remote viewer autonomously controls the movement of the robot from afar, forming a physically separated VR remote presence. This saves travel cost and travel time while still providing real-time live-scene participation, an immersive atmosphere and feeling, and efficient real-time on-site communication; the viewer independently controls the pace of the live broadcast, achieving low-cost, multi-angle, high-freedom VR live streaming. Beyond exhibitions, the approach can support VR remote viewing, VR remote inspection, VR remote visiting, VR remote command, VR remote supervision and similar uses in exhibitions, factories, hospitals and other scenarios requiring remote real-time live-scene presentation.
Example 1:
the communication unit is a 5G CPE gateway, and the robot is connected with the cloud server through a 5G network.
Example 2:
The transcoding performed by the VR video processing module includes resolution transcoding (e.g., 8K to 4K), format transcoding (e.g., H.264 to H.265) and streaming-media encapsulation transcoding (e.g., RTMP to HLS).
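One common way to realize these three transcode steps is an ffmpeg invocation. The helper below only assembles the argument list (it does not run ffmpeg), and the specific ladder shown (4K output, libx265, HLS encapsulation) is an illustrative choice matching the examples above, not mandated by the patent:

```python
def transcode_args(src, dst, height=2160, vcodec="libx265", fmt="hls"):
    """Assemble an ffmpeg-style command covering the three transcode steps
    named in the text: resolution (e.g. 8K -> 4K), codec format
    (H.264 -> H.265 via libx265), and streaming encapsulation
    (RTMP ingest -> HLS delivery)."""
    return [
        "ffmpeg", "-i", src,
        "-vf", f"scale=-2:{height}",  # resolution transcode (keep aspect)
        "-c:v", vcodec,               # format transcode
        "-f", fmt,                    # encapsulation transcode
        dst,
    ]
```

In production the same ladder would typically be produced at several resolutions and bitrates so the player can adapt to client bandwidth.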
Example 3:
The system comprises a voice interaction subsystem consisting of an audio acquisition and playback module and a voice intercom service module; the voice intercom service module is deployed on the cloud server, and the audio acquisition and playback module, consisting of a microphone and a loudspeaker, is arranged on the robot, the VR head-mounted display, the PC and the mobile phone.
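A minimal model of the intercom fan-out, assuming the cloud service keeps one outbound queue per endpoint: audio captured on one endpoint's microphone is relayed to every other endpoint's speaker queue. The endpoint names and byte-string "audio frames" are illustrative:

```python
from queue import Queue

class IntercomRelay:
    """Toy model of the cloud voice-intercom service: a frame published by
    one endpoint is delivered to the speaker queue of every other endpoint."""

    def __init__(self, endpoints):
        self.queues = {name: Queue() for name in endpoints}

    def publish(self, sender, frame):
        """Relay one captured audio frame to all endpoints except the sender."""
        for name, q in self.queues.items():
            if name != sender:
                q.put((sender, frame))

# Endpoints carrying microphones and speakers per the description.
relay = IntercomRelay(["robot", "vr_headset", "pc"])
relay.publish("vr_headset", b"\x00\x01")  # the remote viewer speaks to the site
```

A deployed intercom would run over a real-time transport (e.g. RTP or WebRTC) rather than in-process queues, but the fan-out topology is the same.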
Example 4:
The background management subsystem comprises a login authentication module, an autonomous reservation module, a live-broadcast control module, a bill and balance management module, and a viewing-entrance generation and sharing module. The login authentication module performs login identity verification; the autonomous reservation module selects the intended exhibition and viewing time; the live-broadcast control module selects whether to enable live-broadcast recording; the bill and balance management module handles recharging, fee deduction and bill viewing; and the viewing-entrance generation and sharing module sets live-broadcast viewing passwords and generates and shares live-broadcast entrance links.
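The recharge and fee-deduction flow of the bill and balance management module can be sketched as follows. The field names, the per-minute pricing model and the insufficient-balance handling are assumptions made for illustration; the patent does not specify a charging scheme:

```python
from dataclasses import dataclass, field

@dataclass
class Account:
    """Balance and bill records kept by the bill/balance management module
    (structure is illustrative, not taken from the patent)."""
    balance: float = 0.0
    bills: list = field(default_factory=list)

    def recharge(self, amount):
        """Top up the viewer's balance."""
        self.balance += amount

    def charge(self, minutes, rate_per_minute):
        """Deduct a viewing-session fee and record it on the bill."""
        fee = minutes * rate_per_minute
        if fee > self.balance:
            raise ValueError("insufficient balance")
        self.balance -= fee
        self.bills.append({"minutes": minutes, "fee": fee})
        return fee
```

The bill-viewing operation then reduces to reading `Account.bills`, and the recharge operation to `Account.recharge`.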
Example 5:
The remote viewing end also comprises a PC or a mobile phone. The user watches the VR video content through an HMD or VR glasses and changes the viewing angle by adjusting head posture; traditional playback devices such as a computer or mobile phone can also be used to watch the VR video.
Example 6:
The remote control subsystem consists of a remote control interface module arranged at the remote viewing end, a message communication module arranged on the cloud server, and a robot motion response module arranged on the robot. The remote control interface module transmits control information to the cloud server through an integrated control SDK or by calling a control API; the message communication module encrypts the control messages over the full link; the cloud server transmits the encrypted control information to the motion response module, and the motion response module drives the robot to execute the remote-control actions.
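The text states only that control messages are protected over the full link, without naming a scheme. As one illustrative realization, the sketch below wraps each control message in an HMAC-authenticated envelope that the robot-side motion response module verifies before acting; the shared key, message format and action names are placeholders, and a real deployment would also encrypt the channel (e.g. TLS):

```python
import hashlib
import hmac
import json

SHARED_KEY = b"demo-key"  # placeholder; the patent does not name a key or cipher

def pack_command(action, key=SHARED_KEY):
    """Viewer side: serialize a control message and attach an integrity tag
    so tampering on the link is detectable."""
    body = json.dumps({"action": action}).encode()
    tag = hmac.new(key, body, hashlib.sha256).hexdigest()
    return {"body": body, "tag": tag}

def unpack_command(msg, key=SHARED_KEY):
    """Robot side: reject messages whose tag does not match before the
    motion response module executes anything."""
    expected = hmac.new(key, msg["body"], hashlib.sha256).hexdigest()
    if not hmac.compare_digest(expected, msg["tag"]):
        raise ValueError("bad control message")
    return json.loads(msg["body"])["action"]
```

`hmac.compare_digest` is used instead of `==` so tag comparison runs in constant time.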
Finally, it should be noted that the foregoing is only a preferred embodiment of the invention and the invention is not limited thereto. Although the invention has been described in detail with reference to the foregoing embodiments, those skilled in the art may still modify the technical solutions described therein or substitute equivalents for some of their technical features. Any modification, equivalent replacement or improvement made within the spirit and principles of the invention shall fall within its scope of protection.

Claims (7)

1. A remote autonomous VR viewing robot system based on VR technology and motion control, comprising:
a robot subsystem, a VR live-broadcast subsystem and a remote control subsystem arranged at an exhibition site;
wherein the robot in the robot subsystem is equipped with a VR camera, a SLAM unit and a communication unit; using the SLAM unit, the robot scans the exhibition site and builds a SLAM-based map of it, exhibit positions are marked on the map according to the site layout, and the robot performs autonomous navigation and autonomous obstacle avoidance according to the map and the marked points;
a cloud server is network-connected to the communication unit in the robot subsystem;
a remote viewing end consists of a VR head-mounted display connected to the cloud server through a network;
the VR live-broadcast subsystem consists of a video acquisition module arranged on the robot, a VR video processing module arranged on the cloud server and a VR video viewing module arranged on the VR head-mounted display; the video acquisition module stitches the footage captured by the VR camera into a panoramic video, the acquired panoramic video is transmitted over the network to the VR video processing module, the VR video processing module transcodes the panoramic video and distributes it through a CDN, the transcoded panoramic video is transmitted to the VR video viewing module, and the user watches it through the VR head-mounted display;
and the remote viewing end remotely controls the robot through the remote control subsystem.
2. The remote autonomous VR viewing robot system based on VR technology and motion control of claim 1, wherein the communication unit is a 5G CPE gateway and the robot is connected to the cloud server through a 5G network.
3. The remote autonomous VR viewing robot system based on VR technology and motion control of claim 1, wherein the transcoding performed by the VR video processing module includes resolution transcoding, format transcoding and streaming-media encapsulation transcoding.
4. The remote autonomous VR viewing robot system based on VR technology and motion control of claim 1, further comprising a voice interaction subsystem consisting of an audio acquisition and playback module and a voice intercom service module; the voice intercom service module is deployed on the cloud server, and the audio acquisition and playback module, consisting of a microphone and a loudspeaker, is arranged on the robot, the VR head-mounted display, a PC and a mobile phone.
5. The remote autonomous VR viewing robot system based on VR technology and motion control of claim 1, further comprising a background management subsystem consisting of a login authentication module, an autonomous reservation module, a live-broadcast control module, a bill and balance management module and a viewing-entrance generation and sharing module; the login authentication module performs login identity verification, the autonomous reservation module selects the intended exhibition and viewing time, the live-broadcast control module selects whether to enable live-broadcast recording, the bill and balance management module handles recharging, fee deduction and bill viewing, and the viewing-entrance generation and sharing module sets live-broadcast viewing passwords and generates and shares live-broadcast entrance links.
6. The remote autonomous VR viewing robot system based on VR technology and motion control of claim 1, wherein the remote viewing end further comprises a PC or a mobile phone.
7. The remote autonomous VR viewing robot system based on VR technology and motion control of claim 1, wherein the remote control subsystem consists of a remote control interface module arranged at the remote viewing end, a message communication module arranged on the cloud server and a robot motion response module arranged on the robot; the remote control interface module transmits control information to the cloud server through an integrated control SDK or by calling a control API, the message communication module encrypts the control messages over the full link, the cloud server transmits the encrypted control information to the motion response module, and the motion response module drives the robot to execute the remote-control actions.
CN202310107281.XA 2023-02-14 2023-02-14 Remote autonomous VR (virtual reality) exhibition watching robot system based on VR technology and motion control Pending CN116260988A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202310107281.XA CN116260988A (en) 2023-02-14 2023-02-14 Remote autonomous VR (virtual reality) exhibition watching robot system based on VR technology and motion control

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202310107281.XA CN116260988A (en) 2023-02-14 2023-02-14 Remote autonomous VR (virtual reality) exhibition watching robot system based on VR technology and motion control

Publications (1)

Publication Number Publication Date
CN116260988A 2023-06-13

Family

ID=86685791

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202310107281.XA Pending CN116260988A (en) 2023-02-14 2023-02-14 Remote autonomous VR (virtual reality) exhibition watching robot system based on VR technology and motion control

Country Status (1)

Country Link
CN (1) CN116260988A (en)


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination