WO2019092590A1 - User interaction in a communication system using multiple streaming of augmented reality data - Google Patents

User interaction in a communication system using multiple streaming of augmented reality data

Info

Publication number
WO2019092590A1
Authority
WO
WIPO (PCT)
Prior art keywords
virtual object
user device
screen
displayed
data
Prior art date
Application number
PCT/IB2018/058699
Other languages
English (en)
Russian (ru)
Inventor
Михаил Павлович СУТОВСКИЙ
Original Assignee
ГИОРГАДЗЕ, Анико Тенгизовна
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by ГИОРГАДЗЕ, Анико Тенгизовна
Publication of WO2019092590A1

Links

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 - Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481 - Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F3/0482 - Interaction with lists of selectable items, e.g. menus
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T15/00 - 3D [Three Dimensional] image rendering
    • G06T15/08 - Volume rendering
    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04N - PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00 - Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40 - Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43 - Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/44 - Processing of video elementary streams, e.g. splicing a video clip retrieved from local storage with an incoming video stream or rendering scenes according to encoded video stream scene graphs
    • H04N21/4402 - Processing of video elementary streams involving reformatting operations of video signals for household redistribution, storage or real-time display

Definitions

  • The invention relates to the field of communication technologies, in particular to a method for the real-time multiple streaming of data relating to augmented reality virtual objects.
  • The idea of the present invention is that the sender streams, in real time, data relating to the location captured by the camera of the sender's user device and displayed on the screen of the sender's user device, as well as data about the sender-controlled primary virtual object displayed on the screen of the sender's user device with reference to the specified location, as an augmented reality object of the sender.
  • At least one of the specified recipients can join the sender's streaming by performing its own streaming of data relating to another primary virtual object.
  • The recipient binds this primary virtual object to the specified location, data about which is transmitted in the sender's stream and displayed on the screen of the recipient's user device; thereafter, the specified primary virtual object is displayed on the screens of the user devices of both the sender and the recipient and is defined as an augmented reality object of the sender.
  • The sender and the recipient each control only the actions of their own virtual objects.
  • The remaining recipients of the sender's streaming data, who do not participate in the joint multiple streaming, see on the screens of their user devices the virtual objects controlled by the sender and the recipient, displayed with reference to the specified locations within the scene captured by the camera of the sender's user device, data about which are transmitted in the sender's stream.
  • FIG. 1 is a block diagram of an exemplary embodiment of a method for user interaction in a communication system according to one of the preferred embodiments of the present invention;
  • FIG. 2a is a schematic diagram illustrating the screen of the sender's user device at the stage of selecting the primary virtual object;
  • FIG. 2b is a schematic diagram illustrating the screen of the sender's user device at the stage of setting the location of the primary virtual object;
  • FIG. 2c is a schematic diagram illustrating the screen of the sender's user device, on which a 3D animated virtual object is displayed as an augmented reality object at a given location;
  • FIG. 2d is a schematic diagram illustrating the screen of the sender's user device at the stage of initiating data streaming;
  • FIG. 2e is a schematic diagram illustrating the screen of the recipient's user device at the stage of initiating receipt of the sender's streaming data;
  • FIG. 2f is a schematic diagram illustrating the screen of the recipient's user device displaying the sender's streaming data;
  • FIG. 2g is a schematic diagram illustrating the screen of the recipient's user device at the stage of initiating the streaming of recipient data;
  • FIG. 2h is a schematic diagram illustrating the screen of the recipient's user device at the stage of selecting the primary virtual object;
  • FIG. 2i is a schematic diagram illustrating the screen of the recipient's user device at the stage of setting the location of the primary virtual object;
  • FIG. 2j is a schematic diagram illustrating the screen of the recipient's user device at the stage of streaming the recipient data;
  • FIG. 2k is a schematic diagram illustrating the screen of the sender's user device at the stage of streaming the recipient data;
  • FIG. 3 is a schematic representation of an embodiment of a user device according to one of the preferred embodiments of the present invention
  • FIG. 4 is a schematic representation of an embodiment of a communication system in accordance with one of the preferred embodiments of the present invention.
  • The present invention relates to a method of user interaction in a communication system, a user device for organizing user interaction in a communication system, a communication system that allows this method to be implemented, and a computer-readable medium on which program instructions are stored that initiate aspects of the user interaction method according to the present invention.
  • A method for user interaction in a communication system is proposed, including: real-time streaming of sender data relating to the location displayed on the screen of the sender's user device using the image capture device of the sender's user device, as well as data on at least one primary virtual object controlled by the sender and displayed on the screen of the sender's user device with reference to the specified location displayed on the screen of the sender's user device, as an augmented reality object; real-time display of the sender's streaming data on the screen of the user device of at least one recipient; and real-time streaming of data by at least one recipient, relating to data on at least one primary virtual object controlled by the recipient and displayed on the screen of the recipient's user device with reference to a specified location displayed on the screen of the recipient's user device, as an augmented reality object, together with the sender's streaming data.
  • A user device for organizing user interaction in a communication system is proposed, including an image capture device, at least one processor, and a machine-readable medium connected to the at least one processor and containing program instructions for user interaction in the communication system which, when executed by the at least one processor, provide: real-time streaming of data relating to the location displayed on the screen of the user device using the image capture device of the user device, as well as data on at least one controlled primary virtual object displayed on the screen of the user device with reference to the specified location displayed on the screen of the user device, as an augmented reality object; the ability to display real-time streaming data on the screen of the user device; and the ability to stream, in real time, data relating to at least one controlled primary virtual object displayed on the screen of the user device with reference to the specified location displayed on the screen of the user device, as an augmented reality object, together with streaming data that is being displayed.
  • A communication system is proposed that allows user devices and a server to communicate with each other, containing software instructions located on a machine-readable medium of a user device which, when executed by at least one processor of the user device, provide: real-time streaming of data relating to the location displayed on the screen of the user device using the image capture device of the user device, as well as data on at least one controlled primary virtual object displayed on the screen of the user device with reference to the specified location displayed on the screen of the user device, as an augmented reality object; and the ability to display real-time streaming data on the screen of the user device;
  • A computer-readable medium is proposed that contains software instructions for user interaction in a communication system which, when executed by at least one processor of a user device, provide: real-time streaming of data relating to the location displayed on the screen of the user device using the image capture device of the user device, as well as data on at least one controlled primary virtual object displayed on the user device screen with reference to a specified location displayed on the user device screen, as an augmented reality object; the ability to display live streaming data on the user device screen; and the ability to stream, in real time, data relating to at least one controlled primary virtual object displayed on the screen of the user device with reference to the specified location displayed on the screen of the user device, as an augmented reality object, together with streaming data that is being displayed.
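  • By way of illustration only (this sketch is not part of the patent disclosure), the streamed data described in the aspects above could be modeled roughly as follows; every type and field name here is an assumption introduced for clarity:

```typescript
// Illustrative sketch only; names are assumptions, not part of the disclosure.

// A point in the sender's captured scene to which a primary virtual object is anchored.
interface AnchoredLocation {
  x: number;
  y: number;
  z: number;
}

// State of one primary virtual object controlled by a single user (sender or recipient).
interface PrimaryVirtualObjectState {
  objectId: string;           // which virtual object was selected from the available set
  ownerId: string;            // the user who controls it
  location: AnchoredLocation; // the specified location the object is bound to
  attachments: string[];      // additional virtual objects (text, 2D/3D object, audio, video)
}

// One unit of the sender's real-time stream: the captured scene plus the AR object states.
interface SenderStreamFrame {
  senderId: string;
  timestampMs: number;
  cameraFrame: ArrayBuffer; // encoded frame from the image capture device
  virtualObjects: PrimaryVirtualObjectState[];
}

// A recipient who joins the stream contributes only its own object state,
// anchored to a location already present in the sender's stream.
interface RecipientStreamFrame {
  recipientId: string;
  timestampMs: number;
  virtualObject: PrimaryVirtualObjectState;
}
```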
  • Thus, the sender has the ability to stream, in real time, data relating to the location captured by the camera of the sender's user device and to the primary virtual object displayed on the screen of the sender's user device with reference to the specified location, as an augmented reality object of the sender.
  • A recipient who decides to join connects to the sender's streaming by streaming data relating to another primary virtual object.
  • This primary virtual object of the recipient is displayed on the screens of the user devices of both the sender and the recipient and is defined as an augmented reality object of the sender.
  • The sender and the recipient each control only the actions of their own primary virtual objects.
  • The above features provide the user with a new type and format of communication, affect the emotional atmosphere created between users, and increase users' interest in this interaction.
  • User devices and a server in the communication system communicate via a network through which connections between the server and the user devices are established to enable user interaction in the communication system according to the described method; such networks include, without limitation, the Internet, wireless communication networks, and networks using standard communication technologies and/or protocols.
  • the described communication system can function on any suitable user devices, regardless of the operating systems installed on them.
  • User access to the communication system can be carried out using the appropriate application installed on the user device via the network.
  • An application is a program installed on the user's device and intended for user interaction in the communication system.
  • A user device is, for example, a smartphone, a tablet computer, augmented reality glasses, or any other device that contains an image capture device capable of capturing the world around the user (for example, a camera), a display component capable of allowing the user to see the captured image of the surrounding world (for example, a user device screen), and a network component that allows communication with at least one other user device.
  • Such devices must have computing capacity and components sufficient to run and execute applications based on the current location, as well as to stream data.
  • In this description, the sender should be understood as a user of the communication system who streams data in real time, and the recipient as a user who receives streaming data in real time.
  • The sender's streaming data contains data about the sender's real-world objects displayed on the screen of the sender's user device using the image capture device of the sender's user device, as well as data on at least one virtual object controlled by the sender and displayed on the screen of the sender's user device as an augmented reality object of the sender. Obviously, a recipient who decides to stream recipient data together with the sender's streaming data automatically also becomes a sender.
  • For clarity, however, such a user is still referred to here as the recipient, i.e. the recipient who joins the sender in joint streaming.
  • The streaming data of such a recipient, who streams data together with the sender, contains data about at least one primary virtual object controlled by the recipient and displayed on the screen of the recipient's user device as an augmented reality object of the sender.
  • The recipient may receive a notification on their user device about the sender's data streaming and about the possibility of receiving the specified data.
  • Such a notification may also be a post in the message feed of interacting users, which the recipient can open in order to carry out the further steps of the method according to the present invention.
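  • As a sketch of how such a notification might be represented in practice (the names below are assumptions, not part of the disclosure), a stream-started event could be delivered directly to the recipient's device or published as a post in the shared message feed:

```typescript
// Illustrative only: a notification that a sender has started streaming.
interface StreamStartedNotification {
  streamId: string;
  senderId: string;
  startedAtMs: number;
  kind: "direct" | "feed-post"; // pushed to the device, or published in the message feed
}

// Opening the notification yields the stream identifier the recipient's device would subscribe to.
function openNotification(notification: StreamStartedNotification): string {
  return notification.streamId;
}
```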
  • Prior to streaming, the sender and the at least one recipient perform at least one of: selecting the primary virtual object among the available virtual objects through the user interface on the screen of the user device; and placing the primary virtual object on the screen of the user device with reference to the specified location displayed on the screen of the user device, with its display as an augmented reality object of the sender.
  • The sender and the at least one recipient may also perform at least one of: combining the displayed primary virtual object with an additional virtual object; setting a new location on the screen of the user device and then moving the displayed primary virtual object on the screen of the user device to the newly specified location; and replacing the displayed primary virtual object on the user device screen with another primary virtual object available for selection among the primary virtual objects via the user interface on the user device screen.
  • During the real-time streaming of data, both the sender and the recipient control their own primary virtual objects independently of each other.
  • This control includes replacing the displayed primary virtual object with another primary virtual object from the list of primary virtual objects available for replacement, as well as changing the location of the primary virtual object on the screen of the user device.
  • Each of the users streaming data can combine their primary virtual object with additional virtual objects.
  • At least one of a 2D static or animated virtual object and a 3D static or animated virtual object is used as the primary virtual object.
  • Combining a primary virtual object with an additional one can be implemented by allowing the sender and/or recipient to attach an additional virtual object to the primary virtual object.
  • The additional virtual object can be attached to the primary virtual object at any time: at the initially specified location of the primary virtual object on the screen of the sender's user device, while the primary virtual object is moving, or at the newly specified location of the primary virtual object.
  • Additional virtual objects are also displayed on the screens of the user devices of the sender and the recipient as augmented reality objects of the sender.
  • The additional virtual objects are displayed in association with the primary virtual objects.
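  • A minimal sketch of the per-user controls just listed (select and place, move, replace, attach an additional virtual object), reusing the illustrative types from the earlier sketch; the class and method names are assumptions:

```typescript
// Illustrative sketch of the controls a user has over their own primary virtual object.
class PrimaryVirtualObjectController {
  private state: PrimaryVirtualObjectState | null = null;

  constructor(private readonly ownerId: string) {}

  // Select a primary virtual object from the panel of available objects and place it
  // at a location anchored in the captured scene.
  place(objectId: string, location: AnchoredLocation): void {
    this.state = { objectId, ownerId: this.ownerId, location, attachments: [] };
  }

  // Move the displayed primary virtual object to a newly specified location.
  moveTo(location: AnchoredLocation): void {
    if (this.state) this.state.location = location;
  }

  // Replace the displayed primary virtual object with another available primary virtual object.
  replaceWith(objectId: string): void {
    if (this.state) this.state.objectId = objectId;
  }

  // Attach an additional virtual object (text, 2D/3D object, audio, video) at any time.
  attachAdditional(attachmentId: string): void {
    if (this.state) this.state.attachments.push(attachmentId);
  }

  // Current state of the object, as it would be included in the user's stream.
  snapshot(): PrimaryVirtualObjectState | null {
    return this.state;
  }
}
```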
  • The location setting is provided by binding the primary virtual object to a geographic location captured by the image capture device of the sender's user device, for example by binding the specified object to a table surface displayed on the screen of the user device by the camera of the sender's user device. This applies both to the setting of the location by the sender and to the setting of the location by the recipient.
  • The primary virtual object is displayed, and subsequent movements of the primary virtual object, which is, for example, a 3D animated virtual object, start from the specified target location.
  • At least one of the interacting users, both the one who streams the data and the one who receives the streaming data without streaming data of their own, is allowed to perform at least one of the following operations: sending a text message, sending a voice message, sending a video message, sending a multimedia message, and making a video call.
  • The primary virtual object of the sender and/or of the at least one recipient can be displayed on the screens of user devices from different angles and from different viewpoints. This means that, after at least one primary virtual object is displayed on the screens of the user devices of the sender and/or recipient, the user can view the primary virtual object through the image capture device of their user device from different viewpoints, for example from above or from the side, or can walk around it.
  • FIG. 1 illustrates a block diagram of an exemplary embodiment of a method for user interaction in a communication system according to one of the preferred embodiments of the present invention. The steps of the method illustrated in the flowchart will be further described in more detail with reference to FIG. 2a - 2k.
  • The sender's real-world objects 204 are displayed on the screen 200 of the sender's user device 202 using the image capture device of the sender's user device 202, together with the panel 206 of primary virtual objects available for selection, as illustrated in FIG. 2a.
  • The sender selects the primary virtual object 208 from the primary virtual objects available for selection. This step is described in block 100 of the flowchart shown in FIG. 1.
  • The sender places the selected primary virtual object 208 on the screen 200 of the sender's user device 202. To do so, the sender sets the location 210 of the primary virtual object 208, as illustrated in FIG. 2b.
  • The specified location 210 is captured by the image capture device of the sender's user device 202 and displayed on the screen 200 of the sender's user device 202.
  • The sender can specify the location of an object, for example, by touching a specific place on the touchscreen 200 or by dragging the selected primary virtual object 208 on the touchscreen 200 to a specific place, after which the specified object 208 is attached to the given geographic location displayed by the image capture device of the sender's user device 202 and is displayed as an augmented reality object, as illustrated in FIG. 2c. That is, both the primary virtual object 208 and the sender's real-world objects 204, toward which the image capture device of the sender's user device 202 is pointed, are displayed.
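  • As a sketch of this placement step (hypothetical helper names; a real implementation would rely on the surface-detection or hit-testing facility of the device's AR runtime), a touch on the screen could be converted into an anchored location as follows:

```typescript
// Illustrative only: converting a touch point on the screen into an anchored scene location.
type ScreenPoint = { px: number; py: number };

// Stand-in for whatever hit-testing facility the AR runtime provides (assumed, not a real API).
declare function hitTestScene(point: ScreenPoint): AnchoredLocation | null;

function placeOnTap(
  controller: PrimaryVirtualObjectController,
  objectId: string,
  tap: ScreenPoint
): boolean {
  const location = hitTestScene(tap);   // e.g. the table surface under the touched pixel
  if (!location) return false;          // no detected surface at that point
  controller.place(objectId, location); // the object is now anchored and rendered as an AR object
  return true;
}
```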
  • The sender can inspect the displayed primary virtual object 208 from different sides through the screen 200 of the sender's user device 202, changing the viewing angle by means of the image capture device of the sender's user device 202.
  • The primary virtual object 208 is a 3D animated virtual object. This stage of placing the primary virtual object 208 is described in block 102 of the flowchart shown in FIG. 1.
  • The sender may, in any sequence, for example: combine the displayed primary virtual object 208 with an additional virtual object (for example, text, a 2D or 3D virtual object, audio, or video); set a new location on the screen 200 of the sender's user device 202, as described above, and then move the displayed primary virtual object 208 on the screen 200 of the sender's user device 202 to the newly specified location; or replace the displayed primary virtual object 208 on the screen 200 of the sender's user device 202 with another primary virtual object available for selection among the primary virtual objects through the user interface on the screen 200 of the sender's user device 202, and so on.
  • Combining the primary virtual object 208 with additional virtual objects can be accomplished, for example, by selecting the appropriate additional virtual object from files stored on the sender's user device 202, for example from photos, or by selecting from examples that drop down after touching, with a finger, the corresponding window that says "Click to add an additional virtual object".
  • The sender can perform any available actions on the displayed primary virtual object (the primary virtual object 208 or a new primary virtual object with which the primary virtual object 208 may be replaced), i.e. control their primary virtual object.
  • Real-time data will be streamed relating to the sender's real-world objects 204, displayed using the image capture device of the sender's user device 202, and to the sender's virtual objects displayed on the screen 200 of the device 202 as augmented reality objects of the sender (the primary and additional virtual objects of the sender); that is, for example, data relating to the presence of the primary virtual object 208 at the predetermined location, to the display of additional virtual objects in association with the primary virtual object 208, and to the movement of the primary virtual object 208.
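  • A rough sketch of what assembling and sending one frame of the sender's stream could look like, reusing the earlier illustrative types; the frame-capture and transport hooks are assumptions:

```typescript
// Illustrative only: assembling and sending one frame of the sender's stream.
declare function captureCameraFrame(): ArrayBuffer;              // assumed camera hook
declare function send(streamId: string, payload: unknown): void; // assumed transport hook

function streamSenderFrame(
  streamId: string,
  senderId: string,
  controller: PrimaryVirtualObjectController
): void {
  const objectState = controller.snapshot();
  const frame: SenderStreamFrame = {
    senderId,
    timestampMs: Date.now(),
    cameraFrame: captureCameraFrame(),
    virtualObjects: objectState ? [objectState] : [],
  };
  send(streamId, frame); // recipients render the scene and the sender's AR objects from this payload
}
```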
  • The recipient is provided with the option of receiving the sender's streaming data via the corresponding icon 214, as illustrated in FIG. 2e, the touching of which initiates the display of the sender's streaming data on the screen 216 of the recipient's user device 218.
  • The sender's streaming data is displayed, relating to the sender's real-world objects 204 and to the sender's primary virtual object 208, as illustrated in FIG. 2f. This step of displaying streaming data is described in block 106 of the flowchart shown in FIG. 1.
  • The recipient can observe what is happening on the screen 216 of the recipient's user device 218 and, at a certain point in time, decide to stream data (recipient data) together with the sender's streaming data.
  • The recipient is provided with the option of performing their own streaming of data via the corresponding icon 220, as illustrated in FIG. 2g, the touching of which initiates the start of streaming the recipient's data.
  • the recipient selects the primary virtual object 222 on the screen 216 of the recipient's user device 218, as illustrated in FIG. 2h.
  • The location 224 is set on the screen 216 of the recipient's user device 218, which displays the sender's streaming data, as illustrated in FIG. 2i, for subsequent attachment of the recipient's primary virtual object 222, as illustrated in FIG. 2j.
  • The recipient's actions to set the location of the recipient's primary virtual object on the screen of the recipient's user device are similar to the sender's actions when setting the location of the sender's primary virtual object.
  • The selected primary virtual object 222 of the recipient is displayed as an augmented reality object of the sender. This step of streaming the recipient data is described in block 108 of the flowchart shown in FIG. 1.
  • The screens of both the recipient and the sender now display the data of the multiple streaming, relating to the recipient's streaming data (the recipient's primary virtual object 222) and to the sender's streaming data (the sender's primary virtual object 208 and the sender's real-world objects 204), as illustrated in FIG. 2j and FIG. 2k respectively.
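  • For illustration (again reusing the earlier assumed types), the joint multiple streaming shown in FIG. 2j and 2k amounts to rendering, on top of the sender's captured scene, the object states carried by both streams:

```typescript
// Illustrative only: the set of AR objects every viewer of the joint stream renders
// on top of the sender's captured scene.
function mergeJointStream(
  sender: SenderStreamFrame,
  recipients: RecipientStreamFrame[]
): PrimaryVirtualObjectState[] {
  // The sender's objects (e.g. object 208) plus each joined recipient's object (e.g. object 222),
  // all anchored to locations within the sender's captured scene.
  return [...sender.virtualObjects, ...recipients.map((r) => r.virtualObject)];
}
```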
  • The recipient can perform any available actions on their primary virtual object, similar to the actions available to the sender in relation to the sender's primary virtual object.
  • Each of the users streaming real-time data controls their own primary virtual object independently of the other user.
  • Both the sender and the recipient can view the displayed primary virtual objects 208 and 222 from different sides through the screens of their own user devices, changing the viewing angle by means of the image capture device of the user device.
  • FIG. 3 is a schematic representation of an embodiment of a user device according to one embodiment of the present invention.
  • The specified user device can be either the sender's user device or the recipient's user device and contains the processor 300 and, connected to it, the screen 302, the computer-readable media 304, the network component 306, and the image capture device 308.
  • FIG. 4 shows a schematic representation of an example implementation of a communication system in accordance with one embodiment of the present invention.
  • the specified communication system contains the server 400 and its associated sender and recipient user devices 402 and 404, respectively.
  • The streaming of data from the sender's user device is carried out through the server 400.
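  • As a sketch of the server-side relay implied here (all names are assumptions; the disclosure does not specify a transport), the server could simply fan each incoming stream frame out to the subscribed recipient devices:

```typescript
// Illustrative only: a minimal in-memory relay of stream frames to subscribed recipients.
type FrameHandler = (frame: SenderStreamFrame | RecipientStreamFrame) => void;

class StreamRelay {
  private subscribers = new Map<string, Set<FrameHandler>>(); // streamId -> recipient handlers

  subscribe(streamId: string, handler: FrameHandler): void {
    if (!this.subscribers.has(streamId)) this.subscribers.set(streamId, new Set());
    this.subscribers.get(streamId)!.add(handler);
  }

  // Called for every frame received from the sender (or from a recipient who has joined the stream).
  relay(streamId: string, frame: SenderStreamFrame | RecipientStreamFrame): void {
    for (const handler of this.subscribers.get(streamId) ?? []) {
      handler(frame);
    }
  }
}
```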
  • The user interaction method in the communication system, the user device for organizing user interaction in the communication system, the communication system allowing this method to be implemented, and the computer-readable media on which program instructions are stored that initiate the execution of aspects of the user interaction method according to the present invention are not limited to the specific features or steps described above. On the contrary, the specific features and steps described above are disclosed as examples implementing the present invention, and other equivalent features and steps may be covered by the scope of the present invention.

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Computer Graphics (AREA)
  • Human Computer Interaction (AREA)
  • Information Transfer Between Computers (AREA)
  • Processing Or Creating Images (AREA)

Abstract

The invention relates to a method for user interaction in a communication system, comprising: streaming, in real time, sender data relating to data on a location displayed on the screen of a sender's user device using an image capture device of the sender's user device, as well as data on at least one primary virtual object controlled by the sender and displayed on the screen of the sender's user device with reference to the given location displayed on the screen of the sender's user device as an augmented reality object; displaying, in real time, the sender's streaming data on the screen of the user device of at least one recipient; and streaming, in real time, data of at least one recipient relating to data on at least one primary virtual object controlled by the recipient and displayed on the screen of the recipient's user device with reference to a given location displayed on the screen of the recipient's user device as an augmented reality object, together with the sender's streaming data.
PCT/IB2018/058699 2017-11-09 2018-11-06 User interaction in a communication system using multiple streaming of augmented reality data WO2019092590A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US201762583506P 2017-11-09 2017-11-09
US62/583,506 2017-11-09

Publications (1)

Publication Number Publication Date
WO2019092590A1 true WO2019092590A1 (fr) 2019-05-16

Family

ID=66438310

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/IB2018/058699 WO2019092590A1 (fr) 2017-11-09 2018-11-06 User interaction in a communication system using multiple streaming of augmented reality data

Country Status (1)

Country Link
WO (1) WO2019092590A1 (fr)



Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20130293584A1 (en) * 2011-12-20 2013-11-07 Glen J. Anderson User-to-user communication enhancement with augmented reality
US20140002442A1 (en) * 2012-06-29 2014-01-02 Mathew J. Lamb Mechanism to give holographic objects saliency in multiple spaces
US20140306866A1 (en) * 2013-03-11 2014-10-16 Magic Leap, Inc. System and method for augmented and virtual reality
US20140282162A1 (en) * 2013-03-15 2014-09-18 Elwha Llc Cross-reality select, drag, and drop for augmented reality systems
US20160307374A1 (en) * 2013-12-19 2016-10-20 Metaio Gmbh Method and system for providing information associated with a view of a real environment superimposed with a virtual object
US20170178272A1 (en) * 2015-12-16 2017-06-22 WorldViz LLC Multi-user virtual reality processing

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110149332A (zh) * 2019-05-22 2019-08-20 北京达佳互联信息技术有限公司 直播方法、装置、设备及存储介质
CN110149332B (zh) * 2019-05-22 2022-04-22 北京达佳互联信息技术有限公司 直播方法、装置、设备及存储介质
WO2022121592A1 (fr) * 2020-12-11 2022-06-16 北京字跳网络技术有限公司 Appareil et procédé d'interaction de diffusion continue en direct
CN113423017A (zh) * 2021-06-21 2021-09-21 腾讯科技(深圳)有限公司 直播画面显示方法、装置、计算机设备及存储介质
CN115243096A (zh) * 2022-07-27 2022-10-25 北京字跳网络技术有限公司 直播间展示方法、装置、电子设备及存储介质

Similar Documents

Publication Publication Date Title
US10356363B2 (en) System and method for interactive video conferencing
US9787945B2 (en) System and method for interactive video conferencing
US9615058B2 (en) Apparatus and method for sharing content items among a plurality of mobile devices
US11979244B2 (en) Configuring 360-degree video within a virtual conferencing system
WO2019092590A1 (fr) User interaction in a communication system using multiple streaming of augmented reality data
US10754526B2 (en) Interactive viewing system
US20160227115A1 (en) System for digital media capture
US10986301B1 (en) Participant overlay and audio placement collaboration system platform and method for overlaying representations of participants collaborating by way of a user interface and representational placement of distinct audio sources as isolated participants
WO2019072096A1 (fr) Interactive method, device and system, and computer-readable storage medium, in live video streaming
EP2685715A1 (fr) Method and device for managing video resources in videoconferencing
KR20170091913A (ko) Method and apparatus for providing a video service
CN115509398A (zh) Method for displaying emoticons using an instant messaging service, and user device therefor
US20200201512A1 (en) Interactive editing system
WO2017035368A1 (fr) System and method for interactive video conferencing
WO2019056001A1 (fr) System and method for interactive video conferencing
US10942633B2 (en) Interactive viewing and editing system
WO2019082050A1 (fr) User interaction in a communication system using augmented reality history and messages
WO2019087014A1 (fr) User interaction via streaming of augmented reality data
CN106162234A (zh) Method and apparatus for sharing a television program
US20230300180A1 (en) Remote realtime interactive network conferencing
US20220122037A1 (en) Meeting and collaborative canvas with image pointer
CN111586465B (zh) Operation interaction method, apparatus, device and storage medium for a live streaming room
US20230113024A1 (en) Configuring broadcast media quality within a virtual conferencing system
WO2019064160A1 (fr) User interaction in a communication system using augmented reality objects
WO2019106558A1 (fr) User interaction in a communication system using augmented reality objects

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 18876651

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 18876651

Country of ref document: EP

Kind code of ref document: A1