CN109923859B - Companion device for real-time collaboration in a teleconference session - Google Patents

Info

Publication number
CN109923859B
CN109923859B (granted publication of application CN201780067116.9A)
Authority
CN
China
Prior art keywords
computing device
teleconference
streams
session
data
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201780067116.9A
Other languages
Chinese (zh)
Other versions
CN109923859A (en)
Inventor
J·T·福尔克纳
K·欧哈拉
E·D·坎努托蒂伊尔
E·S·劳埃德林特尔
K·莫里森
R·克里什
A·维泽尔
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Microsoft Technology Licensing LLC
Original Assignee
Microsoft Technology Licensing LLC
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Microsoft Technology Licensing LLC
Publication of CN109923859A
Application granted
Publication of CN109923859B
Legal status: Active
Anticipated expiration

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04L TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L65/00 Network arrangements, protocols or services for supporting real-time applications in data packet communication
    • H04L65/1066 Session management
    • H04L65/1083 In-session procedures
    • H04L65/1093 In-session procedures by adding participants; by removing participants
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04L TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L65/00 Network arrangements, protocols or services for supporting real-time applications in data packet communication
    • H04L65/40 Support for services or applications
    • H04L65/401 Support for services or applications wherein the services involve a main real-time session and one or more additional parallel real-time or time sensitive sessions, e.g. white board sharing or spawning of a subconference
    • H04L65/4015 Support for services or applications wherein the services involve a main real-time session and one or more additional parallel real-time or time sensitive sessions, e.g. white board sharing or spawning of a subconference, where at least one of the additional parallel sessions is real time or time sensitive, e.g. white board sharing, collaboration or spawning of a subconference
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04L TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L65/00 Network arrangements, protocols or services for supporting real-time applications in data packet communication
    • H04L65/40 Support for services or applications
    • H04L65/403 Arrangements for multi-party communication, e.g. for conferences
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04L TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L67/00 Network arrangements or protocols for supporting network services or applications
    • H04L67/2866 Architectures; Arrangements
    • H04L67/30 Profiles
    • H04L67/303 Terminal profiles
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04L TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L67/00 Network arrangements or protocols for supporting network services or applications
    • H04L67/50 Network services
    • H04L67/52 Network services specially adapted for the location of the user terminal
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04L TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L67/00 Network arrangements or protocols for supporting network services or applications
    • H04L67/50 Network services
    • H04L67/75 Indicating network or usage conditions on the user display
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04M TELEPHONIC COMMUNICATION
    • H04M3/00 Automatic or semi-automatic exchanges
    • H04M3/42 Systems providing special services or facilities to subscribers
    • H04M3/42229 Personal communication services, i.e. services related to one subscriber independent of his terminal and/or location
    • H04M3/42263 Personal communication services, i.e. services related to one subscriber independent of his terminal and/or location, where the same subscriber uses different terminals, i.e. nomadism
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04M TELEPHONIC COMMUNICATION
    • H04M3/00 Automatic or semi-automatic exchanges
    • H04M3/42 Systems providing special services or facilities to subscribers
    • H04M3/56 Arrangements for connecting several subscribers to a common circuit, i.e. affording conference facilities
    • H04M3/568 Arrangements for connecting several subscribers to a common circuit, i.e. affording conference facilities, audio processing specific to telephonic conferencing, e.g. spatial distribution, mixing of participants
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04M TELEPHONIC COMMUNICATION
    • H04M7/00 Arrangements for interconnection between switching centres
    • H04M7/0024 Services and arrangements where telephone services are combined with data services
    • H04M7/0027 Collaboration services where a computer is used for data transfer and the telephone is used for telephonic communication
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N7/00 Television systems
    • H04N7/14 Systems for two-way working
    • H04N7/141 Systems for two-way working between two video terminals, e.g. videophone
    • H04N7/147 Communication arrangements, e.g. identifying the communication as a video-communication, intermediate storage of the signals
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N7/00 Television systems
    • H04N7/14 Systems for two-way working
    • H04N7/15 Conference systems
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481 Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F3/0482 Interaction with lists of selectable items, e.g. menus
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04L TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L67/00 Network arrangements or protocols for supporting network services or applications
    • H04L67/2866 Architectures; Arrangements
    • H04L67/30 Profiles
    • H04L67/306 User profiles
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04M TELEPHONIC COMMUNICATION
    • H04M2203/00 Aspects of automatic or semi-automatic exchanges
    • H04M2203/20 Aspects of automatic or semi-automatic exchanges related to features of supplementary services
    • H04M2203/2094 Proximity
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04M TELEPHONIC COMMUNICATION
    • H04M3/00 Automatic or semi-automatic exchanges
    • H04M3/42 Systems providing special services or facilities to subscribers
    • H04M3/56 Arrangements for connecting several subscribers to a common circuit, i.e. affording conference facilities
    • H04M3/563 User guidance or feature selection
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04M TELEPHONIC COMMUNICATION
    • H04M3/00 Automatic or semi-automatic exchanges
    • H04M3/42 Systems providing special services or facilities to subscribers
    • H04M3/56 Arrangements for connecting several subscribers to a common circuit, i.e. affording conference facilities
    • H04M3/563 User guidance or feature selection
    • H04M3/564 User guidance or feature selection whereby the feature is a sub-conference
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N7/00 Television systems
    • H04N7/14 Systems for two-way working
    • H04N7/141 Systems for two-way working between two video terminals, e.g. videophone
    • H04N7/142 Constructional details of the terminal equipment, e.g. arrangements of the camera and the display
    • H04N2007/145 Handheld terminals

Landscapes

  • Engineering & Computer Science (AREA)
  • Signal Processing (AREA)
  • Multimedia (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • General Engineering & Computer Science (AREA)
  • Business, Economics & Management (AREA)
  • General Business, Economics & Management (AREA)
  • Telephonic Communication Services (AREA)
  • Two-Way Televisions, Distribution Of Moving Picture Or The Like (AREA)
  • Information Transfer Between Computers (AREA)
  • Computer Vision & Pattern Recognition (AREA)

Abstract

The present disclosure provides a variety of techniques for enhancing a user experience when joining a teleconference session with multiple devices. When a user attempts to join a teleconference session using the same user identity on multiple devices, the system distinguishes these devices as a master device and at least one companion device. The master device has a first set of controls for controlling the teleconference session, and the at least one companion device has a set of companion controls for sharing content. In some embodiments, the master device also has a selected set of streams (e.g., a stage view), and the companion device has a subset of those streams, or other streams selected based on activity level. In addition, the present disclosure provides a variety of techniques for enabling users to share content using companion devices.

Description

Companion device for real-time collaboration in a teleconference session
Background
Communication and collaboration are key aspects of people's lives, both social and business. Communication and collaboration tools have been developed to enable people to contact each other and share experiences. In many cases, the purpose of these tools is to provide an experience over a network that reflects real-life interactions between individuals and groups. Interaction is typically provided by audio and/or visual elements.
These tools include instant messaging, voice calls, video calls, group chatting, sharing desktops, sharing media and content, sharing applications, and so forth. These tools may perform the capturing, manipulating, transmitting, and rendering of audio and visual elements, as well as attempt to provide a collaborative environment using various combinations of these elements. The tools may be accessible by a user at a user device, which may be provided by a laptop or desktop computer, a mobile phone, a tablet device, a game console, and so forth. The devices may be linked in a variety of possible network architectures such as a peer-to-peer architecture or a client-server architecture or hybrids (e.g., a centrally managed peer-to-peer architecture). In general, some or all of these architectures may facilitate teleconferencing sessions in an attempt to achieve rich communications and collaboration somewhat similar to face-to-face collaborative conferences.
However, some current techniques may have a number of deficiencies when a user joins a teleconference. For example, when a user joins a teleconference session using a first device and later attempts to join the session using another device, existing systems may not be able to use all of the resources in an efficient manner. For example, in some existing systems, when a user joins a teleconference session using two devices, each device displays an exact copy of the session content. Thus, in this scenario, the second device cannot actually be used to enhance the user's interaction with the shared media or other session participants. Additionally, in some cases, when a user joins a teleconference session using more than one device, each device may cause audio interference.
Accordingly, there is a need for improved teleconferencing techniques that address these and other problems.
Disclosure of Invention
The present disclosure provides a variety of techniques for enhancing a user experience when joining a teleconference session with multiple devices. When a user attempts to join a teleconference session using the same user identity on multiple devices, the system distinguishes these devices as a master device and at least one companion device. The master device has a first set of controls for controlling the teleconference session, and the at least one companion device has a set of companion controls for sharing content. In some embodiments, the master device also has a selected set of streams (e.g., a stage view), and the companion device has a subset of those streams, or other streams selected based on activity level. In addition, the present disclosure provides a variety of techniques for enabling users to easily share content using companion devices.
In some configurations, a teleconference system may generate teleconference data that includes multiple streams associated with a teleconference session. The teleconference system may also receive, from a first computing device associated with a first user identity, a first request to join a teleconference session. The first user identity may be associated with a particular user or participant of the teleconference session. Further, the request can be instantiated at the first computing device using the user interface. The first computing device may be a primary device configured to anchor an immersive teleconferencing experience.
In response to the request, the teleconference system may transmit teleconference data to the first computing device to display or cause display of a main user interface. In one configuration, the main user interface includes a presentation of the plurality of streams. The presentation of the plurality of streams may include one or more presentations of individual streams arranged within the user interface. Such a display may be referred to herein as a "stage" view of the teleconference session. Additionally, the teleconference system may also cause the first computing device to display a first set of controls to control aspects of the teleconference session. For example, the first set of controls (also referred to herein as core controls) may be used to join and leave a session, mute the audio of a session, and so forth.
The teleconference system may also receive a second request to join the teleconference session from a second computing device associated with the first user identity. The second computing device may be used by the same user associated with the first user identity. The teleconference system may cause the second computing device to display a secondary user interface with companion controls for sharing content. Additionally, the teleconference system may cause the second computing device to display a selection of streams that may be arranged to present a reduced or otherwise condensed portion of the teleconference data for mitigating or avoiding distraction, interference, and other elements that may disrupt the teleconference session.
In response to the second request, the teleconference system may select at least one stream of the plurality of streams based on an activity level or other value. Upon selection of the at least one stream, the teleconference system may transmit teleconference data to the second computing device, causing the second computing device to display a companion user interface. The companion user interface may include a presentation of the at least one stream. Thus, while the first device may receive the teleconference data with the plurality of streams, the second device may be configured to receive only a sub-portion of that data, i.e., the selected at least one stream. By efficiently transmitting sub-portions of the teleconference data that contain relevant, and in some cases important, but non-repeating content, the teleconference system can overcome many of the technical deficiencies associated with conventional teleconference systems.
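The activity-based stream selection described above can be sketched as follows. This is a minimal illustration, not an implementation from the patent; the names `Stream` and `select_companion_streams`, and the use of a single numeric activity score, are assumptions.

```python
from dataclasses import dataclass

@dataclass
class Stream:
    stream_id: str
    source: str            # e.g. "camera" or "screen-share" (illustrative)
    activity_level: float  # e.g. recent audio energy or motion score (assumed)

def select_companion_streams(streams, max_streams=1):
    """Return the most active stream(s) for a companion device.

    A primary device would receive the full list of streams; a companion
    device receives only the top `max_streams` by activity level.
    """
    ranked = sorted(streams, key=lambda s: s.activity_level, reverse=True)
    return ranked[:max_streams]

streams = [
    Stream("s1", "camera", 0.2),
    Stream("s2", "screen-share", 0.9),
    Stream("s3", "camera", 0.5),
]
print([s.stream_id for s in select_companion_streams(streams)])  # ['s2']
```

A real system would compute the activity score from live media analysis (speaker detection, content changes); the sorting step shown here only captures the selection policy.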
It should be noted that some other forms of establishing a teleconference session based on a single user identity and multiple user devices are described herein. According to one illustrative example, the teleconferencing system may also receive requests from a plurality of computing devices to join a teleconferencing session. The teleconference system may generate teleconference data associated with the teleconference session and distribute the teleconference data to the master device and the companion device similarly as described above or in any desired order.
Thus, somewhat similar to the aspects set forth above, the teleconferencing system may distinguish between multiple devices that join a teleconferencing session to identify the multiple devices associated with a single user identity. Thereafter, the teleconference system may transmit teleconference data for causing display of the plurality of streams on a primary device associated with the user, and transmit teleconference data for causing display of at least one selected stream in a companion device associated with the user.
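The device-distinguishing step above can be sketched as a small registry keyed by user identity. This is a hypothetical sketch under one simple policy (first device to join becomes primary, later devices become companions); the class and method names are illustrative, not from the patent.

```python
class SessionRegistry:
    """Tracks which devices have joined a session under each user identity."""

    def __init__(self):
        self.devices = {}  # user_id -> list of (device_id, role) tuples

    def join(self, user_id, device_id):
        """Register a device; the first device per identity is 'primary',
        subsequent devices are 'companion'."""
        roles = self.devices.setdefault(user_id, [])
        role = "primary" if not roles else "companion"
        roles.append((device_id, role))
        return role

reg = SessionRegistry()
print(reg.join("alice", "laptop"))  # primary
print(reg.join("alice", "phone"))   # companion
print(reg.join("bob", "desktop"))   # primary
```

The patent also contemplates joining in any desired order, so a production policy might reassign roles after join rather than fixing them at join time.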
According to another illustrative example, a first computing device may send a first request to join a teleconference session. The first computing device may be associated with a first user identity. In response to the request, the first computing device may receive first teleconferencing data that includes a plurality of streams associated with the teleconferencing session. The first computing device may also display a presentation of the plurality of streams.
Additionally, the second computing device may send a second request to join the teleconference session. The second computing device may also be associated with the first user identity. In response to the second request, the second computing device may receive second teleconference data that includes a subset of the streams of the plurality of streams. The second computing device may also display a presentation of that subset of streams. The subset of streams may be selected based, at least in part, on data indicative of an activity level associated with those streams. Thus, in this example, when a user first joins a teleconference session using a first device, the user may subsequently introduce a second "companion" device to enhance the teleconference experience. The reverse also applies: the user may first join the teleconference session using the companion device and then join using the first device.
In some aspects, the user may use control commands associated with the master device and sharing commands associated with the companion device. The user may reconfigure the commands on each device by providing an input command (e.g., a reconfiguration command). Thus, the teleconference system can facilitate "device swapping" by effectively switching the master device and the companion device for the user. In some implementations, reverse swapping is also possible.
In addition, the user may also use control commands and reconfiguration commands to allow the device to seamlessly "exit" the teleconference session. In one example, the teleconferencing system may receive a reconfiguration command to cause a transition from the second computing device to the first computing device. In response to receiving the reconfiguration command, the teleconference system may stop transmitting teleconference data to the second computing device. Thus, the second computing device may be logged off and the first computing device may take over as the only master device associated with the first user identity.
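The device-swap and exit behaviors described in the two paragraphs above can be sketched as a pure role transformation. The command strings `"swap"` and `"exit:<device_id>"` are hypothetical encodings invented for this sketch; the patent does not specify a command format.

```python
def apply_reconfiguration(roles, command):
    """Apply a reconfiguration command to a role map.

    roles: dict mapping device_id -> 'primary' or 'companion'.
    'swap' exchanges the primary and companion roles;
    'exit:<device_id>' removes a device, and if exactly one device
    remains, it takes over as the sole primary device.
    """
    if command == "swap":
        return {d: ("companion" if r == "primary" else "primary")
                for d, r in roles.items()}
    if command.startswith("exit:"):
        leaving = command.split(":", 1)[1]
        remaining = {d: r for d, r in roles.items() if d != leaving}
        if len(remaining) == 1:
            return {d: "primary" for d in remaining}
        return remaining
    return roles  # unrecognized commands leave the roles unchanged

roles = {"laptop": "primary", "phone": "companion"}
print(apply_reconfiguration(roles, "swap"))
print(apply_reconfiguration(roles, "exit:phone"))
```

In the system described above, the server would additionally stop transmitting teleconference data to a device that exits; this sketch only models the role bookkeeping.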
Other types of commands for enriching a user experience are also disclosed. For example, the commands may allow any particular device to be restricted or augmented from one mode of operation to another. In one illustrative example, two or more modes of operation may be established by the teleconferencing system for the first user identity. These modes of operation may include, for example, a primary mode of operation and a companion mode of operation. Thus, once a device enters a teleconference session, different types of contextual user interfaces may be displayed to allow users to access and share relevant information based on these modes of operation. For example, a companion control menu may be displayed to enable various types of media data to be shared to be easily selected using the companion device. In addition, a main control menu may be displayed to enable control of the main device. The combination of control menus and the use of modes of operation enable a user to join a teleconference session in a manner that reduces distraction, while also allowing the user to select and share the most relevant content among more than one device.
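The mode-dependent control menus described above can be sketched as a simple lookup from operating mode to the controls displayed. The menu entries are illustrative placeholders, not a control set taken from the patent.

```python
# Hypothetical mapping of operating mode to contextual control menu.
CONTROL_MENUS = {
    "primary": ["mute", "hang-up", "stage-view", "add-participant"],
    "companion": ["share-file", "share-screen", "share-camera"],
}

def controls_for(mode):
    """Return the control menu shown for a device in the given mode."""
    return CONTROL_MENUS[mode]

print(controls_for("primary"))
print(controls_for("companion"))
```

The point of the split is the one made in the text: core session controls live on the primary device, while the companion device surfaces only sharing controls, reducing distraction.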
As will be described in more detail below, in addition to establishing the master device and the master and companion operating modes, the teleconference system may manipulate components of the master and companion devices to enhance the teleconference session and limit disruption. For example, the teleconferencing system may remotely disable the camera components of one or more computing devices such that only a single "main" or "face view" camera is associated with a single user identity. Additionally, the teleconferencing system may remotely disable the audio components of one or more devices to limit audio feedback from the user devices. Other manipulations of these components may also be possible, for example, by selective activation to allow a user to use a "streaming camera" on the companion device to share a unique view that is independent of and distinct from the camera view on the host device.
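The component manipulation described above can be sketched as a per-device configuration rule: camera and microphone are enabled only on the primary device, with an optional override for a companion device that is actively sharing its camera. The function name and parameters are assumptions made for this sketch.

```python
def configure_components(devices, sharing_camera=None):
    """Compute camera/mic enablement per device.

    devices: list of (device_id, role) with role 'primary' or 'companion'.
    sharing_camera: optional device_id of a companion whose camera is
    selectively activated to share a distinct view (the "streaming
    camera" case); its microphone stays off to avoid audio feedback.
    """
    config = {}
    for device_id, role in devices:
        config[device_id] = {
            "camera_enabled": role == "primary" or device_id == sharing_camera,
            "mic_enabled": role == "primary",
        }
    return config

cfg = configure_components([("laptop", "primary"), ("phone", "companion")],
                           sharing_camera="phone")
print(cfg)
```

Under this policy only one microphone is live per user identity, which models the feedback-avoidance goal stated above, while the camera override models the selective-activation case.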
As will be described in greater detail below, the techniques described herein relating to a teleconferencing system may be combined in a variety of ways to enable a user to participate in a variety of scenarios, including scenarios in which the user has multiple devices for enriching a teleconferencing session.
This summary is provided to introduce a selection of concepts in a simplified form that are further described below in the detailed description. This summary is not intended to identify key features or essential features of the claimed subject matter, nor is it intended to be used as an aid in determining the scope of the claimed subject matter. For example, the term "techniques" may refer to systems, methods, computer-readable instructions, modules, algorithms, hardware logic, and/or operations as permitted by the context described above and throughout this document.
Drawings
The detailed description describes embodiments with reference to the drawings. In the drawings, the left-most digit(s) of a reference number identifies the figure in which the reference number first appears. The use of the same reference symbols in different drawings indicates similar or identical items.
FIG. 1 is a diagram illustrating an exemplary environment in which a system may operate to facilitate real-time collaboration of companion devices in a teleconference session.
FIG. 2 is a diagram illustrating components of an exemplary computing device configured to facilitate real-time collaboration of companion devices in a teleconference session.
FIGS. 3A-3D illustrate some exemplary scenarios in which the techniques disclosed herein may be implemented using a computing device.
FIG. 4 illustrates an exemplary user interface arrangement of a host computing device.
FIGS. 5A-5B illustrate several exemplary user interface arrangements for companion computing devices.
FIGS. 6A-6C illustrate several exemplary user interface arrangements of companion computing devices in different forms.
FIGS. 7A-7D illustrate several exemplary user interface arrangements with shared content using a streaming camera view of a companion computing device.
FIG. 8 is a flow diagram of an exemplary method that facilitates real-time collaboration by companion devices in a teleconference session.
Detailed Description
The present disclosure provides a variety of techniques for enhancing a user experience when joining a teleconference session with multiple devices. Prior to joining the teleconference session, the teleconference system may generate teleconference data that includes multiple streams associated with the teleconference session. The teleconference system may also receive, from a first computing device associated with a first user identity, a first request to join a teleconference session. The first user identity may be associated with a particular user or participant of the teleconference session. Further, the request can be instantiated at the first computing device using the user interface. The first computing device may be a primary device configured to anchor an immersive teleconferencing experience.
In response to the request, the teleconference system may transmit teleconference data to the first computing device to display or cause display of a main user interface. In one configuration, the main user interface includes a presentation of the plurality of streams. The presentation of the plurality of streams may include one or more presentations of individual streams arranged within the user interface. Accordingly, the first computing device may display a presentation of one or more streams associated with the teleconference data.
The teleconference system may also receive a second request to join the teleconference session from a second computing device associated with the first user identity. The second computing device may be used by the same user associated with the first user identity. The second computing device may be similar in arrangement to the first computing device, or may be a different form of computing device. In general, the second computing device may be a computing device (e.g., a companion device) arranged to present a reduced or otherwise condensed portion of the teleconference data while avoiding distraction, interference, and other elements that may disrupt the teleconference session.
In response to the second request, the teleconference system may select at least one stream of the plurality of streams based on an activity level or other value. Upon selection of the at least one stream, the teleconference system may transmit teleconference data to the second computing device, causing the second computing device to display a companion user interface. The companion user interface may include a presentation of the at least one stream. Thus, while a first device may receive teleconferencing data having multiple streams, a second device may be configured to receive only a sub-portion of that data, i.e., the selected at least one stream. By efficiently transmitting sub-portions of the teleconference data that contain relevant, and in some cases important, but non-repeating content, the teleconference system can overcome many of the technical deficiencies associated with conventional teleconference systems.
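The per-device transmission step above can be sketched as payload assembly: the primary device's payload contains every stream, while each companion's payload contains only the most active subset. The representation of a stream as a `(stream_id, activity)` tuple and the function name are assumptions of this sketch.

```python
def build_payloads(streams, devices, companion_count=1):
    """Assemble the teleconference data sent to each device.

    streams: list of (stream_id, activity) tuples.
    devices: list of (device_id, role), role 'primary' or 'companion'.
    Primary devices receive all streams; companions receive the top
    `companion_count` streams by activity.
    """
    ranked = sorted(streams, key=lambda s: s[1], reverse=True)
    payloads = {}
    for device_id, role in devices:
        if role == "primary":
            payloads[device_id] = list(streams)
        else:
            payloads[device_id] = ranked[:companion_count]
    return payloads

streams = [("video-a", 0.1), ("screen", 0.8)]
print(build_payloads(streams, [("laptop", "primary"), ("phone", "companion")]))
```

In a real pipeline the "payload" would be encoded media routed per subscription rather than a Python dict, but the routing decision mirrors the one described in the text.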
It should be noted that several other forms of establishing a teleconference session based on a single user identity and multiple user devices are described herein. According to one illustrative example, the teleconferencing system may also receive requests from a plurality of computing devices to join a teleconferencing session. The teleconference system may generate teleconference data associated with the teleconference session and distribute the teleconference data to the master device and the companion device similarly as described above or in any desired order.
Thus, somewhat similar to the aspects set forth above, the teleconferencing system may distinguish between multiple devices that join a teleconferencing session to identify the multiple devices associated with a single user identity. Thereafter, the teleconference system may transmit teleconference data for causing display of the presentation of the plurality of streams, and transmit teleconference data for causing display of the presentation of at least one selected stream.
According to another illustrative example, a first computing device may send a first request to join a teleconference session. The first computing device may be associated with a first user identity. In response to the request, the first computing device may receive first teleconferencing data that includes a plurality of streams associated with the teleconferencing session. The first computing device may also display a presentation of the plurality of streams.
Additionally, the second computing device may send a second request to join the teleconference session. The second computing device may also be associated with the first user identity. In response to the second request, the second computing device may receive second teleconference data that includes a subset of the streams of the plurality of streams. The second computing device may also display a presentation of that subset of streams. The subset of streams may be selected based, at least in part, on data indicative of an activity level associated with those streams. Thus, in this example, when a user joins a teleconference session for the first time using a first device, the user may also introduce a second "companion" device to enhance the teleconference experience. The reverse also applies: the user may first join the teleconference session using the companion device and then join later using the first device.
In some aspects, the user may also use control commands and reconfiguration commands to transition the associated devices between the primary and companion modes of operation. For example, the teleconferencing system may transmit different teleconference data to each device based on the control commands and the reconfiguration commands. Thus, the teleconference system can facilitate "device swapping" by effectively switching the primary operating mode and the companion operating mode between the user's devices. In some implementations, reverse swapping is also possible.
In addition, the user may also use control commands and reconfiguration commands to allow the device to seamlessly "exit" the teleconference session. In one example, the teleconferencing system may receive a reconfiguration command to cause a transition from the second computing device to the first computing device. In response to receiving the reconfiguration command, the teleconference system may stop transmitting teleconference data to the second computing device. Thus, the second computing device may be logged off and the first computing device may take over as the only master device associated with the first user identity.
Other types of commands for enriching the user experience are also disclosed. For example, the commands may allow any particular device to transition from one mode of operation to another, with its capabilities restricted or augmented accordingly. In one illustrative example, two or more modes of operation may be established by the teleconferencing system for the first user identity. These modes of operation may include, for example, a primary mode of operation and a companion mode of operation. Thus, once a device enters a teleconference session, different types of contextual user interfaces may be displayed to allow users to access and share relevant information based on these modes of operation. For example, a companion control menu may be displayed to enable various types of media data to be easily selected for sharing using the companion device. In addition, a main control menu may be displayed to enable control of the main device. The combination of control menus and the use of modes of operation enables a user to join a teleconference session in a manner that reduces distraction, while also allowing the user to select and share the most relevant content among more than one device.
As will be described in more detail below, in addition to establishing the master device and the master and companion operating modes, the teleconference system may manipulate components of the master and companion devices to enhance the teleconference session and limit disruption. For example, the teleconferencing system may remotely disable the camera components of one or more computing devices such that only a single "main" or "face view" camera is associated with a single user identity. Additionally, the teleconferencing system may remotely disable the audio components of one or more devices to limit audio feedback from the user devices. Other manipulations of these components may also be possible, for example, by selective activation to allow a user to share a unique view from the companion device that is independent and unique from the camera view on the host device using a "streaming camera" on the companion device.
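One way such a component policy could be modeled, purely as an illustration (the device records, field names, and function name below are hypothetical and not drawn from the disclosure):

```python
def apply_component_policy(devices, user_id):
    # Keep the camera and microphone enabled only on the user's primary
    # device, so a single user identity contributes exactly one video
    # feed and one audio feed to the session.
    for device in devices:
        if device["user_id"] != user_id:
            continue
        is_primary = device["mode"] == "primary"
        device["camera_enabled"] = is_primary
        device["microphone_enabled"] = is_primary
    return devices

devices = [
    {"user_id": "user-1", "mode": "primary",
     "camera_enabled": True, "microphone_enabled": True},
    {"user_id": "user-1", "mode": "companion",
     "camera_enabled": True, "microphone_enabled": True},
]
apply_component_policy(devices, "user-1")
print(devices[1]["camera_enabled"])  # False
```

Selective re-activation of a companion camera for content sharing would amount to overriding the policy for that one component.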
As will be described in more detail below, the techniques described herein relating to teleconferencing systems may be combined in a variety of ways to enable users to participate in a variety of scenarios, including the following scenarios: a user has multiple devices for joining a teleconference session while avoiding repeated experiences, improving the effective utilization of resources, and enriching the teleconference. Various examples, scenarios and aspects are described below with reference to fig. 1-8.
In fig. 1, a diagram illustrating an example of a teleconferencing system 100 is shown, wherein a system 102 is operable to provide a teleconference session 104 in accordance with an exemplary implementation. In this example, the teleconference session 104 is between a plurality of client computing devices 106(1) through 106(N) (where N is a positive integer having a value of two or more). The client computing devices 106(1) through 106(N) enable users to participate in the teleconference session 104. In this example, the teleconference session 104 may be hosted by the system 102 over one or more networks 108. That is, the system 102 may provide a service that enables users of the client computing devices 106(1) through 106(N) to participate in the teleconference session 104. Alternatively, the teleconference session 104 may be hosted by one of the client computing devices 106(1) through 106(N) utilizing peer-to-peer technology.
System 102 includes device(s) 110, and device(s) 110 and/or other components of system 102 may include distributed computing devices that communicate with each other, with system 102, and/or with the client computing devices 106(1) through 106(N) via one or more networks 108. In some examples, system 102 may be a stand-alone system responsible for managing aspects of one or more teleconference sessions 104. For example, the system 102 may be implemented by a computer system such as
GOOGLE, etc., to be managed by the respective entity.
For example, the network 108 may include a public network, such as the Internet, a private network such as an organization and/or personal intranet, or some combination of a private network and a public network. The network 108 may also include any type of wired and/or wireless network including, but not limited to, a local area network ("LAN"), a wide area network ("WAN"), a satellite network, a wired network, a Wi-Fi network, a WiMax network, a mobile communication network (e.g., 3G, 4G, etc.), or any combination thereof. Network 108 may use communication protocols including packet-based and/or datagram-based protocols, such as internet protocol ("IP"), transmission control protocol ("TCP"), user datagram protocol ("UDP"), or other types of protocols. In addition, the network 108 may also include a number of devices that facilitate network communications and/or form the basis of network hardware, such as switches, routers, gateways, access points, firewalls, base stations, repeaters, backbones, and so forth.
In some examples, the network 108 may also include devices that enable connections to wireless networks, such as wireless access points ("WAPs"). Examples support connectivity by WAPs that transmit and receive data on various electromagnetic frequencies (e.g., radio frequencies), including WAPs that support institute of electrical and electronics engineers ("IEEE") 802.11 standards (e.g., 802.11g, 802.11n, etc.) and other standards.
In various examples, device 110 may include one or more computing devices operating in a cluster or other grouped configuration to share resources, balance load, improve performance, provide failover support or redundancy, or for other purposes. For example, device 110 may belong to various classes of devices, such as traditional server-type devices, desktop computer-type devices, and/or mobile-type devices. Thus, although illustrated as a single type of device (a server-type device), device 110 may comprise a wide variety of device types and is not limited to a particular type of device. Device 110 may represent, but is not limited to, a server computer, a desktop computer, a web server computer, a personal computer, a mobile computer, a laptop computer, a mobile phone, a tablet computer, or any other type of computing device.
The client computing device (e.g., one of client computing devices 106(1) through 106(N)) may be of various types of devices (which may be the same as or different from device 110), such as a traditional client type device, a desktop computer type device, a mobile type device, an application-specific type device, an embedded device, and/or a wearable device. Thus, client computing devices may include, but are not limited to, desktop computers, game consoles and/or gaming devices, tablet computers, personal data assistants ("PDAs"), mobile phone/tablet hybrids, laptop computers, telecommunications devices, computer navigation-type client computing devices (e.g., satellite-based navigation systems including global positioning system ("GPS") devices), wearable devices, virtual reality ("VR") devices, Augmented Reality (AR) devices, implanted computing devices, an automotive computer, a network-enabled television, a thin client, a terminal, an internet of things ("IoT") device, a workstation, a media player, a personal video recorder ("PVR"), a set-top box, a camera, an integrated component for inclusion in a computing device (e.g., a peripheral device), an appliance, or any other kind of computing device. In some implementations, the client computing device includes an input/output ("I/O") interface capable of communicating with input/output devices, such as user input devices including peripheral input devices (e.g., game controllers, keyboards, mice, pens, voice input devices, touch input devices, gesture input devices, etc.) and/or output devices including peripheral output devices (e.g., displays, printers, audio speakers, tactile output devices, etc.).
The various classes and device types of client computing devices 106(1) through 106(N) may represent any type of computing device having one or more processing units 112 operatively connected (e.g., via a bus 116) to a computer-readable medium 114, which bus 116 may include, in some instances, one or more of a system bus, a data bus, an address bus, a PCI bus, a Mini-PCI bus, and any class of local, peripheral, and/or independent buses.
Executable instructions stored on computer-readable media 114 may include, for example, an operating system 128, a client module 130, a profile module 132, and other modules, programs, or applications that may be loaded and executed by processing unit 112.
The client computing devices 106(1) through 106(N) may also include one or more interfaces 134 for enabling communication with other input devices 148, such as a network interface, a camera, a keyboard, a touch screen, and a pointing device (mouse). For example, the interface 134 may also enable communication between the client computing devices 106(1) through 106(N) and other networked devices (e.g., devices of the device 110 and/or the system 102) over the network 108. Such a network interface 134 may include one or more Network Interface Controllers (NICs) or other types of transceiver devices to send and receive communications and/or data over a network.
In the exemplary environment 100 of fig. 1, the client computing devices 106(1) through 106(N) may use their respective client modules 130 to connect with each other and/or other external devices in order to participate in the teleconference session 104. For example, a first user may use a client computing device 106(1) to communicate with a second user of another client computing device 106(2). When executing the client module 130, users may share data, which may cause the client computing device 106(1) and the other client computing devices 106(2) through 106(N) to connect to the system 102 over the network 108.
The client module 130 of each client computing device 106(1) through 106(N) may include logic to detect user input and transmit control signals to the server module 136 to request a change to how the teleconference session 104 is presented on the display. For example, the client module 130 in the first client computing device 106(1) in fig. 1 may detect user input at the input device 148. The user input may be sensed as, for example, a finger press on a user interface element displayed on a touch screen (e.g., touch screen 150(2)), or a mouse click on a user interface element selected by a pointer on the display 150. The client module 130 translates the user input according to the functionality associated with the selected user interface element. In some cases, the client module 130 may require a function to be performed for the teleconference session 104. In this case, the client module 130 sends a control signal 156(1) to the server module 136 requesting that the function be performed for the teleconference session 104.
In one exemplary function, a user of the client computing device 106(1) may wish to cause a transition of the client computing device 106(1) from the first mode of operation to the second mode of operation for the teleconference session 104. The user may click on the desired user interface element on the user's display 150. In response, the client module 130 sends a control signal 156(1) to the server module 136. In response to the control signal 156(1), the server module 136 performs the desired transition based on the teleconference data 146(1), data indicating the location of one or more other computing devices 106, and other appropriate information.
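A simplified sketch of how a server module might process such a mode-transition control signal, considering only a single user's devices and ignoring location data and other inputs (all names below are illustrative, not part of the disclosure):

```python
def handle_control_signal(device_modes, signal):
    # Transition the requesting device to the desired mode; if it asks to
    # become primary, demote the current primary to companion first so a
    # single user identity keeps exactly one primary device.
    device_id = signal["device_id"]
    requested = signal["requested_mode"]
    if requested not in ("primary", "companion"):
        raise ValueError(f"unknown operating mode: {requested}")
    if requested == "primary":
        for other_id, mode in device_modes.items():
            if mode == "primary":
                device_modes[other_id] = "companion"
    device_modes[device_id] = requested
    return device_modes

modes = {"desktop": "primary", "phone": "companion"}
handle_control_signal(modes, {"device_id": "phone", "requested_mode": "primary"})
print(modes)  # {'desktop': 'companion', 'phone': 'primary'}
```

The demote-then-promote order preserves the invariant that at most one device per user identity operates in the primary mode.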
The client computing devices 106(1) through 106(N) may use their respective profile modules 132 to generate and provide participant profiles to other client computing devices and/or to the devices 110 of the system 102. A participant profile may include one or more of the following, which may also be stored: the identity of the participant (e.g., name, user identity, unique identifier ("ID"), etc.) and participant data (e.g., personal data and location data). Participant profiles may be used to register participants with the teleconference session 104 and to indicate a priority or preference associated with a user identity and/or a client computing device.
As shown in FIG. 1, the device 110 of the system 102 includes a server module 136, a data storage module 138, and an output module 140. The server module 136 is configured to receive streams 142(1) through 142(M) (where M is a positive integer equal to a value of 2 or greater) from respective client computing devices 106(1) through 106 (N). In some scenarios, not all of the client computing devices participating in the teleconference session 104 provide instances of the stream 142, and thus, M (the number of submitted instances) may not equal N (the number of client computing devices). In some other scenarios, one or more client computing devices may be transmitting additional streams or transmissions of media data that include content such as documents or other similar types of media intended to be shared during the teleconference session 104.
The server module 136 is also configured to receive, generate, and transmit session data 144, and to store the session data 144 in the data storage module 138. In various examples, the server module 136 may select aspects of the streams 142 to be shared with the client computing devices 106(1) through 106(N). Server module 136 can combine the streams 142 to generate teleconference data 146 that defines aspects of teleconference session 104. Teleconference data 146 may include a selection of the streams 142. Teleconference data 146 may define aspects of teleconference session 104, such as the user interface arrangement of the user interfaces on the client computing devices 106, the type of data displayed, and other functions of the server and clients. Server module 136 may configure teleconference data 146 for each client computing device 106(1) through 106(N). Teleconference data 146 may include various instances referenced as 146(1) through 146(N). In addition, teleconference data 146 may include first teleconference data 146(1) for communicating with the master computing device, and second teleconference data 146(2) through 146(N) for communicating with the companion computing devices. Output module 140 may transmit teleconference data instances 146(1) through 146(N) to client computing devices 106(1) through 106(N). Specifically, in this example, output module 140 transmits teleconference data 146(1) to client computing device 106(1), teleconference data 146(2) to client computing device 106(2), teleconference data 146(3) to client computing device 106(3), and teleconference data 146(N) to client computing device 106(N), respectively.
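The per-device configuration of teleconference data might be sketched as follows; the rule that companion devices receive only shared-content streams is one possible policy assumed here for illustration, not a requirement of the disclosure, and all names are hypothetical:

```python
def build_teleconference_data(streams, device_modes):
    # Produce one payload per device: the primary device receives every
    # stream, while companion devices receive only content streams
    # (documents and other shared media), avoiding duplicate delivery.
    payloads = {}
    for device_id, mode in device_modes.items():
        if mode == "primary":
            payloads[device_id] = list(streams)
        else:
            payloads[device_id] = [s for s in streams if s["kind"] == "content"]
    return payloads

streams = [
    {"id": "stream-1", "kind": "participant"},
    {"id": "stream-2", "kind": "content"},
]
payloads = build_teleconference_data(
    streams, {"desktop": "primary", "phone": "companion"}
)
print(len(payloads["desktop"]), len(payloads["phone"]))  # 2 1
```

An output module would then transmit each payload to its corresponding device.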
Teleconference data instances 146(1) through 146(N) may convey audio, and may include video representing the contribution of each participant in teleconference session 104. Each of the teleconference data instances 146(1) through 146(N) may also be configured in a manner that is unique to the needs of each participating user of the client computing devices 106(1) through 106(N). Each client computing device 106(1) through 106(N) may be associated with a teleconference session view. An example of using a teleconference session view to control the view at the client computing device of each participant is described with reference to fig. 2.
In fig. 2, a system block diagram is shown depicting components of an exemplary device 200, the exemplary device 200 configured to provide a teleconference session 104 between client computing devices (e.g., client computing devices 106(1) through 106(N)) according to an exemplary implementation. The device 200 may be used to illustrate some components of one of the client computing devices 106. Additionally, device 200 may represent one of the devices 110, where device 200 includes one or more processing units 202, computer-readable media 204, and a communication interface 206. The components of device 200 may be operatively connected, for example, via a bus 207, which bus 207 may include one or more of the following: a system bus, a data bus, an address bus, a PCI bus, a Mini-PCI bus, and various local, peripheral, and/or independent buses.
As used herein, a processing unit (e.g., processing unit 202 and/or processing unit 112) may represent, for example, a CPU-type processing unit, a GPU-type processing unit, a field programmable gate array ("FPGA"), another type of digital signal processor ("DSP"), or other hardware logic component that may be driven by a CPU in some instances. For example, and without limitation, illustrative types of hardware logic components that may be used include application specific integrated circuits ("ASICs"), application specific standard products ("ASSPs"), system on a chip ("SOCs"), complex programmable logic devices ("CPLDs"), and so forth.
As used herein, a computer-readable medium (e.g., computer-readable medium 204 and/or computer-readable medium 114) may store instructions that are executable by a processing unit. The computer-readable medium may also store instructions that are executable by an external processing unit, e.g., by an external CPU, an external GPU, and/or by an external accelerator such as an FPGA type accelerator, a DSP type accelerator, or any other internal or external accelerator. In various examples, at least one CPU, GPU, and/or accelerator is incorporated in the computing device, while in some examples, one or more of the CPU, GPU, and/or accelerator is external to the computing device.
Computer-readable media may include computer storage media and/or communication media. Computer storage media may include one or more of the following: volatile memory, non-volatile memory, and/or other persistent and/or auxiliary computer storage media, removable and non-removable computer storage media implemented in any method or technology for storage of information such as computer-readable instructions, data structures, program modules, or other data. Thus, computer storage media includes media in tangible and/or physical form that is included in a device and/or a hardware component that is part of or external to a device, including, but not limited to, random access memory ("RAM"), static random access memory ("SRAM"), dynamic random access memory ("DRAM"), phase change memory ("PCM"), read only memory ("ROM"), erasable programmable read only memory ("EPROM"), electrically erasable programmable read only memory ("EEPROM"), flash memory, compact disc read only memory ("CD-ROM"), digital versatile discs ("DVD"), optical cards or other optical storage media, magnetic cassettes, magnetic tape, magnetic disk storage, magnetic cards or other magnetic storage devices or media, solid state memory devices, storage arrays, network attached storage, storage area networks, hosted computer storage, or any other storage memory, storage device, and/or storage medium that can be used to store and maintain information for access by a computing device.
In contrast to computer storage media, communication media may embody computer-readable instructions, data structures, program modules, or other data in a modulated data signal (e.g., a carrier wave or other transport mechanism). As defined herein, computer storage media does not include communication media. That is, computer storage media does not include communication media consisting only of modulated data signals, carrier waves, or propagated signals.
Communication interface 206 may represent, for example, a network interface controller ("NIC") or other type of transceiver device for sending and receiving communications over a network.
In the illustrated example, the computer-readable media 204 includes the data storage module 138. In some examples, the data storage module 138 includes data stores such as databases, data warehouses, or other types of structured or unstructured data stores. In some examples, the data storage module 138 includes a corpus and/or relational databases having one or more tables, indexes, stored procedures, and/or the like to enable data access including, for example, one or more of a hypertext markup language ("HTML") table, a resource description framework ("RDF") table, a Web ontology language ("OWL") table, and/or an extensible markup language ("XML") table.
The data storage module 138 may store data for the operation of processes, applications, components, and/or modules stored in the computer-readable media 204 and/or executed by the processing units 202 and/or accelerators. For example, in some examples, the data storage module 138 may store the session data 144, the profile data 210, and/or other data. The session data 144 may include the total number of users in the teleconference session 104, the activities occurring in the teleconference session 104 (e.g., participant behaviors, activities), and/or other data related to the time and manner in which the teleconference session 104 is conducted or hosted. Examples of the profile data 210 include, but are not limited to, a user identity ("ID"), a priority value, and other data.
In an exemplary implementation, the data storage module 138 stores data related to the view each user experiences on the display of the user's client computing device 106. As shown in fig. 2, data storage module 138 may include teleconference session modes 250(1) through 250(N) that correspond to the operating modes of each client computing device 106(1) through 106(N) participating in teleconference session 104. A teleconference session mode 250 may also be referred to herein as an "operating mode". Using the teleconference session modes 250, teleconference system 102 may support separate control over the views and modes each user experiences on multiple devices during the teleconference session 104. For example, as described in more detail below, the system 102 allows a user to participate with the user's client computing devices 106 operating in one of several operating modes.
The modes of operation may include, for example, a master mode of operation and a companion mode of operation. Once the computing device 106 enters the teleconference session 104, different types of contextual user interfaces may be displayed on the display 150 to enable the user to access and share relevant information based on these operating modes. For example, a companion control menu may be displayed to enable a user to easily select various types of media data to be shared by the user's companion computing device. Additionally, a master control menu may be displayed to enable control of the state of the teleconference session 104 at the user's master computing device. The combination of control menus and the use of operational modes allows a user to join the teleconference session 104 with multiple devices in a manner that reduces distraction, while also allowing the user to select and share the most relevant content. In general, teleconference system 102 may select a master device to operate in a master mode of operation. Additionally, teleconference system 102 may select one or more companion devices to operate in a companion operating mode.
Using the techniques described herein, several methods for selecting a master device and thus establishing a master mode of operation and a companion mode of operation may be facilitated. For example, when associated with a single user identity, the primary device may be selected based on the order of devices attempting to join the teleconference session 104. Thus, the first device that the user attempts to join may be selected as the master device. In other aspects, teleconference system 102 may analyze priority values associated with one or more computing devices to determine whether a first computing device of a user has a priority value that supersedes priority values of other computing devices associated with the same user identity.
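A minimal sketch of this selection logic, assuming each device record carries a priority value and a join order (the record fields and function name are hypothetical, introduced here for illustration):

```python
def select_primary_device(devices):
    # The highest priority value wins; among equal priorities, the device
    # that attempted to join earliest becomes the primary device.
    best = min(devices, key=lambda d: (-d["priority"], d["join_order"]))
    return best["device_id"]

devices = [
    {"device_id": "tablet", "priority": 1, "join_order": 0},
    {"device_id": "laptop", "priority": 5, "join_order": 1},
]
print(select_primary_device(devices))  # laptop
```

Here the laptop's priority value supersedes the tablet's earlier join order; with equal priorities, the tablet would have been selected instead.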
In addition to establishing the master device and the master and companion operating modes, teleconference system 102 may also manipulate components of the companion device to enhance teleconference session 104 and limit disruption. For example, teleconference system 102 may remotely disable a camera component (e.g., input device 148) of a companion computing device. Additionally, teleconference system 102 may remotely disable audio components (e.g., input devices 148) of the companion computing device. Other operations on these components may also be possible, for example, by selective activation to allow a user to "enter the arena" or to share content using a companion device instead of the primary device.
As described above, data storage module 138 may store profile data 210, streams 142, teleconference mode 250, teleconference data 146, and session data 144. Alternatively, some or all of the data referenced above may be stored on onboard separate memory 224 on one or more processing units 202, e.g., onboard memory on a CPU-type processor, GPU-type processor, FPGA-type accelerator, DSP-type accelerator, and/or another accelerator. In this example, the computer-readable media 204 also includes an operating system 226 and an application programming interface 228 configured to expose functionality and data of the device 110 (e.g., the example device 200) to external devices associated with the client computing devices 106(1) through 106 (N). Additionally, the computer-readable medium 204 includes one or more modules (e.g., the server module 136 and the output module 140), although the number of modules shown is by way of example only, and the number may be higher or lower. That is, the functionality described herein in connection with the illustrated modules may be performed by a fewer number of modules or a greater number of modules on a device, or distributed across multiple devices.
Thus, as previously described, in general, the teleconferencing system 102 is configured to host a teleconferencing session 104 having a plurality of client computing devices 106(1) through 106 (N). Teleconference system 102 includes one or more processing units 202 and computer-readable media 204, the computer-readable media 204 having computer-executable instructions encoded thereon to cause the one or more processing units 202 to receive streams 142(1) through 142(M) from a plurality of client computing devices 106(1) through 106(N) at system 102, select stream 142 based at least in part on a teleconference session mode 250 of each user's computing device, and transmit teleconference data 146 in accordance with teleconference session mode 250 corresponding to client computing devices 106(1) through 106 (N). Teleconference data 146(1) through 146(N) is transmitted from system 102 to a plurality of client computing devices 106(1) through 106 (N). Teleconference session modes 250(1) through 250(N) cause the plurality of client computing devices 106(1) through 106(N) to display a view of teleconference session 104 under user control. The computer-executable instructions also cause the one or more processing units 202 to determine that the teleconference session 104 is to transition the client computing devices 106(1) through 106(N) to a different teleconference session mode 250 based on the user-transmitted control signals 156 (also referred to herein as control commands 156) and other appropriate information. In some configurations, the control commands 156 include at least one of: user identity, a request to join the teleconference session 104, an access request, a transfer request, or other data described herein.
In some implementations, the techniques disclosed herein may use one or more predetermined modes of operation, which are also referred to as "modes" or "teleconference session modes". In exemplary operation, the system 102 performs a method comprising: receiving streams 142(1) through 142(M) at system 102 from a plurality of client computing devices 106(1) through 106(N), combining and formatting streams 142 to form teleconference data 146(1) through 146(N) based at least in part on a user identity of each client computing device, and sending teleconference data 146(1) through 146(N) to the respective client computing devices.
As will be described below, the predetermined operating modes may include a first operating mode (referred to herein as "primary") and a second operating mode (referred to herein as "companion"). In some implementations, the master mode of operation and the companion mode of operation can be automatically configured by intelligently selecting the master device and the companion device using profile data accessible by teleconference system 102. The predetermined operating mode may also facilitate providing graphical elements ("control elements") of control functionality for the teleconference session 104. For example, graphical elements may be generated on a user interface that enable a user to provide content, end a session, mute one or more sounds, control the flow of other participants, transition a particular device to a different mode of operation, and so forth.
In one illustrative example, the techniques disclosed below may use a master mode of operation. The primary mode of operation may be a "full function" form of operation that facilitates anchoring the immersive experience for a particular user. According to one aspect, only a single device associated with a user identity may be in a primary mode of operation. In the primary mode, an audio device such as a microphone may be enabled to receive audio information (e.g., sound, speech, etc.) from a user. Also in the master mode, a video device such as a front-facing camera may be enabled to receive video information (e.g., video recordings) from a user. Thus, the master mode of operation, and thus the device operating in the master mode of operation, may be fully functional and receive multiple forms of audio/video input from the participant.
In another illustrative example, the techniques disclosed herein may use a secondary mode of operation. The secondary mode of operation may be a form of "companion" operation that facilitates user engagement with multiple devices while not compromising the immersive experience provided by the primary device operating in the primary mode of operation. According to one aspect, one or more computing devices associated with a single user identity may operate in a companion mode of operation. In the companion mode, an audio capture device of the associated computing device may be disabled to mitigate audio feedback. Also in the companion mode, the camera device (e.g., front facing camera) may be disabled depending on whether video information is being actively shared from the associated device. In this manner, duplicate video and audio streams for the same user identity may be avoided while reducing network usage and increasing the efficiency of the teleconferencing system 102.
In general, the selection of the device 106 to operate in the primary and companion operating modes may be based on user identity and/or profile data 210 associated with the plurality of computing devices. For example, a user may join the teleconference session 104 using two or more computing devices 106 by issuing a request 156 from each computing device 106 to join the teleconference session 104.
In response to the request 156 or requests, the teleconference system 102 may select the first computing device 106(1) as the master computing device for operating in the master mode of operation. In general, the host computing device may be used to anchor the immersive experience for the user and limit the distractions caused by conventional teleconferencing systems.
After the selection, teleconference system 102 may generate teleconference data 146 associated with teleconference session 104. In one illustrative example, the teleconference data 146 may include participant streams 142 having videos of one or more participants, and content streams 142 having videos or images of files, data structures, word processing documents, and other sharable content.
After generating teleconference data 146, teleconference system 102 may transmit first teleconference data 146(1) to the master computing device and second teleconference data 146(2) through 146(N) to the companion computing devices. Thus, while the master device is the only device that receives the first selection of teleconference data 146(1), the companion devices receive at least a portion of the teleconference data (e.g., teleconference data 146(2) through 146(N)). By efficiently generating the master and companion teleconference data 146 with relevant content, the teleconference system 102 of this illustrative example can overcome many of the technical deficiencies associated with conventional teleconference systems, including wasted bandwidth, over-use of the computing resources of all client computing devices to process a full selection of teleconference data, and other deficiencies.
A more detailed discussion of different scenarios involving the exemplary teleconference session 104, various computing devices 106, and teleconference data 146 follows. It should be understood that these exemplary scenarios do not limit the uses of the described technology.
Fig. 3A-3D illustrate a number of scenarios in which the techniques of the present disclosure may be used. As shown in fig. 3A, each user 301 is associated with one or more devices 106. The following description depicts scenarios in which individual users 301 join the teleconference session 104, illustrating how a device may start in either the master mode or the companion mode.
In a first scenario, a first user 301A at a first location 311 is associated with a first computing device 106A, which the first user 301A uses to call into a teleconference session 104 already in communication with a fifth computing device 106E used by a fourth user 301D and a fifth user 301E at a second location 310. Since the first user 301A is using a single device, the first computing device 106A may enter the teleconference session 104 in the primary mode of operation. Thus, the microphone and speaker of the first computing device 106A are turned on when the first computing device 106A joins the teleconference session. Additionally, both the first computing device 106A and the fifth computing device 106E may operate in a master mode of operation. Further, both the first computing device 106A and the fifth computing device 106E may receive first teleconference data 146(1), which includes a first selection of teleconference data or streams 142.
In a second scenario, a first user 301A is joining the teleconference session 104 using a first computing device 106A, a second user 301B is joining the teleconference session 104 using a second computing device 106B, a third user 301C is joining the teleconference session 104 using a third computing device 106C and a fourth computing device 106D, a fourth user 301D is joining the teleconference session 104 using a fifth computing device 106E, and a fifth user 301E is joining the teleconference session 104 using the fifth computing device 106E. In general, the first computing device 106A, the second computing device 106B, and the third computing device 106C may all join the teleconference session 104 in the master mode of operation. However, as shown in fig. 3A, the third computing device 106C and the fourth computing device 106D are associated with a single user identity (e.g., associated with the third user 301C), and the third user 301C is joining the same teleconference session 104 using the third computing device 106C and the fourth computing device 106D. Thus, the third computing device 106C may join the teleconference session 104 in the primary mode of operation, while the fourth computing device 106D may join the teleconference session 104 in the companion mode of operation.
In a further example, if the third user 301C first joins the teleconference session 104 using the third computing device 106C, the third computing device 106C may join the teleconference session 104 in the master mode of operation. Subsequently, if the third user 301C also attempts to join the teleconference session 104 with the fourth computing device 106D, the fourth computing device 106D may join the teleconference session 104 in the companion mode of operation.
Additionally, as shown in fig. 3B, both the third computing device 106C and the fourth computing device 106D may initially join the teleconference session 104 in the master mode of operation. Thereafter, upon determining that the third computing device 106C and the fourth computing device 106D are both associated with the same user identity (e.g., the user identity of the third user 301C), teleconference system 102 may transition the third computing device 106C to the primary mode of operation for user 301C and may transition the fourth computing device 106D to the companion mode of operation. As shown, the companion mode of operation may include different user interface arrangements, presentations of streams, and other aspects as compared to the main mode of operation.
As described herein, a master device may be selected based on one or more factors. For example, if the third computing device 106C is the first computing device to enter a teleconference session for a particular user identity, the third computing device 106C may become the master device. The third computing device 106C may also be selected as the master device based on any suitable data (e.g., priority, user preferences, or user input). The third computing device 106C may also be selected as the master device based on device type. For example, if the third computing device 106C is a laptop computer and the fourth computing device 106D is a mobile phone, the third computing device 106C may be selected as the master device because its device type likely has greater capability than that of the fourth computing device 106D. Other considerations in selecting the master device may also be applicable.
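One possible way to combine the factors above is a capability ranking with join order as a tie-breaker. The ranking and the helper `select_primary` below are purely illustrative assumptions; the disclosure names the factors (join order, priority, user preference, device type) but fixes no particular weighting.

```python
# Hypothetical capability ranking by device type; not part of the disclosure.
DEVICE_PRIORITY = {"desktop": 3, "laptop": 2, "tablet": 1, "phone": 0}

def select_primary(candidates):
    """Select the master device from (device_id, device_type, join_order)
    tuples: higher capability wins, and an earlier join order breaks ties."""
    return max(
        candidates,
        key=lambda d: (DEVICE_PRIORITY.get(d[1], -1), -d[2]),
    )[0]

# A laptop (106C) beats a mobile phone (106D) regardless of join order.
assert select_primary([("106C", "laptop", 1), ("106D", "phone", 0)]) == "106C"
# Equal capability: the device that joined first wins.
assert select_primary([("A", "phone", 0), ("B", "phone", 1)]) == "A"
```

Using a tuple key keeps the two factors strictly ordered: device type dominates, and join order only matters between devices of equal capability.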
Upon selecting or determining the third computing device 106C as the master device, the teleconference system 102 may also select the fourth computing device 106D as the companion device, as shown in fig. 3B. As further shown, the master device and the companion device are configured to operate in the master mode of operation and the companion mode of operation, respectively.
Continuing with this scenario, the techniques described herein also allow for selection of data streams to be presented at the companion device and the master device. For example, as shown in fig. 3C, master device 106C may include a presentation of multiple streams 142 and a set of core teleconference control elements. Further, companion device 106D may include a presentation of the selected plurality of streams 142(2) and a set of companion teleconference control elements.
Additionally, as shown in fig. 3D, master device 106C may include a presentation of multiple streams 142 and a set of core teleconference control elements. Further, companion device 106D may include a presentation of selected content streams 142(2) based on the activity level and a set of companion teleconference control elements. Other variations and combinations of these above example scenarios and interface arrangements are also applicable.
Furthermore, the techniques described herein may also be applied to users joining from a remote location. As shown in fig. 3A, a sixth user 301F is joining the teleconference session 104 using a sixth computing device 106F and a seventh computing device 106G. In this example, the sixth user 301F is located at a remote location 312 that is separate from the second location 310. When the sixth user 301F uses the sixth computing device 106F to join the teleconference session 104, the sixth computing device 106F may join the teleconference session 104 in the master mode, such that the sixth user 301F participates in the fully immersive experience using the sixth computing device 106F. Additionally, the seventh computing device 106G may join the teleconference session 104 in a companion mode of operation. Still further, the sixth user 301F may switch functionality between the devices, causing them to "swap" operational modes, changing from the primary mode to the companion mode and vice versa.
As summarized above, the techniques disclosed herein provide different modes of operation for a teleconferencing session. Additionally, a user interface and user interface arrangement may be provided for enabling users to access and share related information based on the primary and companion operational modes, to further enhance the teleconferencing system 102. Several examples of these user interfaces are shown in fig. 4, 5A-5B, and 6A-6C. In particular, fig. 4 shows aspects of a first user interface 400 that may be displayed on a device 106 in communication with a teleconference session 104. The exemplary user interface 400 may be displayed on a device 106 (e.g., a desktop computer) operating in a primary mode of operation.
As shown in fig. 4, the first user interface 400 includes a main portion 402 that provides a stage for the teleconference session 104. Selected media data (e.g., streaming video, files, and applications) shared in the teleconference session 104 may be displayed in the main portion 402 of the user interface. As shown, the main portion 402 may display various participant streams and/or content streams 142 from the teleconference session 104. Accordingly, the main portion 402 displays the presentation of the first teleconference data described above. The first teleconference data 146(1) may generally include a fuller first selection of teleconference data, as compared to the reduced sub-portion of data transmitted to a companion device.
The first user interface 400 may also include a plurality of core teleconference control elements 403, which may be configured to control aspects of the teleconference session 104. In general, core teleconference control element 403 is configured to control the state of teleconference session 104. For example, a first button of core teleconference control element 403 may disconnect device 106 from teleconference session 104. A second button of core teleconference control element 403 may control the microphone of device 106 (i.e., a mute button). A third button of core teleconference control element 403 may control the camera of device 106 (i.e., turn the camera on or off). A fourth button, or any suitably positioned control element, may invoke the swap function. In response to receiving a user actuation of the fourth button, the primary device may transition to become the companion device. In some configurations, the display of the core teleconference control element 403 may fade out over a predetermined period of time. In such a configuration, core teleconference control element 403 may be redisplayed based on one or more actions (e.g., hovering over a predetermined portion of user interface 400 or another suitable user interaction).
When device 106 is connected to a teleconference session as a master device (e.g., when the master device is a single device or a device associated with a single user identity), a menu may be displayed with an arrangement of selectable elements that provide access to various types of media. Media may include, but is not limited to: desktop sharing, application sharing, or file sharing. The user interface 400 may be configured to allow a user to scroll additional selectable elements by sliding or scrolling selectable elements to the left or right. In addition, when the user provides a gesture, such as a swipe up, the display of selectable elements may be expanded. Accordingly, there are various user interface arrangements for the master device in order to allow fully immersive control of the teleconference session.
Turning now to fig. 5A-5B, several graphical user interfaces associated with a companion device in a companion mode of operation are described in detail. FIG. 5A illustrates aspects of a companion user interface 500 for a companion device. In general, user interface 500 includes a simplified user interface configured to display a sub-portion of teleconferencing data 146 based, at least in part, on an activity level. The example of fig. 5A shows a control menu with a shared control element 501 for displaying options for sharing media and/or options for promoting a companion device to a host device. In addition, the interface 500 also includes a conversation stage 502 that is somewhat similar to the conversation stage 402 described above.
For example, the first control 501A is configured to activate a camera device. In response to receiving a user actuation of the first control 501A, the camera of the device is activated, the live video feed from the camera may be presented "on stage" (e.g., if permitted by the master device), and the presentation of the real-time feed is displayed on other devices in communication with the teleconference session 104. The second control 501B is configured to activate file sharing and application sharing functions. In response to receiving a user actuation of the second control 501B, the companion device enables the user to select an application or file to be shared with the teleconference session 104. Once the file or application is selected, the rendering of the selected media may be presented "on stage," e.g., the presentation of the selected media is transmitted to the devices in communication with the teleconference session 104.
The third control 501C is configured to activate the swap function. In response to receiving a user actuation of the third control 501C, the companion device may become the master device based on considerations of teleconferencing system 102. For example, if device 106 is associated with a particular user identity, receives a user actuation of the third control 501C, and there is no longer a master device associated with that user identity, teleconference system 102 may cause the device to switch to the master operational mode to maintain the user's immersive experience. Transitioning to the master mode may also be facilitated by demoting an existing master device associated with the user identity to the companion operating mode, allowing the initial companion device to become the user's master device. Accordingly, several implementations of the swap from the companion mode to the master mode may be facilitated by the techniques described herein.
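The promote-and-demote bookkeeping just described can be sketched as follows. This is a hypothetical helper, not the disclosed implementation: it preserves the one-master-per-identity rule by demoting any existing master when a companion device requests promotion.

```python
def swap_to_primary(primary_by_user, user_id, requesting_device):
    """Promote a companion device to the master role for a user identity.

    If the identity already has a master device, that device is demoted to
    the companion mode, preserving the one-master-per-identity invariant.
    Returns the demoted device id, or None if no master existed.
    """
    demoted = primary_by_user.get(user_id)  # current master, if any
    primary_by_user[user_id] = requesting_device
    return demoted

roles = {"user301C": "device106C"}            # 106C is currently the master
old = swap_to_primary(roles, "user301C", "device106D")
assert old == "device106C"                    # existing master was demoted
assert roles["user301C"] == "device106D"      # companion was promoted
```

The same call also covers the "no remaining master" case mentioned above: when `primary_by_user` has no entry for the identity, the requesting device simply becomes the master and `None` is returned.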
The fourth control 501D provides connection control for the teleconference session 104; for example, the fourth control 501D may be a "hang up" or disconnect button. These examples are provided for illustrative purposes only and should not be construed as limiting. It should be appreciated that other controls may be displayed that provide additional functionality for the companion device.
As shown, the interface 500 does not have a core teleconference control element for controlling a microphone or other form of audio communication, as in some implementations, the companion device is not configured to transmit audio signals. In addition, as shown in FIG. 5B, other variations of the interface 500 including a missing or hidden companion set of control elements are also applicable.
Contextual menu controls, functions, searchable content, and recent or contextual content may be exposed for many different conversational and collaborative core control needs, such as: sharing, recording, remote control, PSTN/VoIP calls, sensor detection, device detection/selection, and so on. The layout of the control menu can also adapt to different screen layouts: top, bottom, sides, etc. These affordances may also be invoked by different gestures: swipe up, swipe down, swipe left, swipe right, tap, double tap, tap and hold, and so on.
In some configurations, the companion device may select different types of media data for display. For example, the companion device may display the received teleconference data based on the activity level. In one illustrative example, as shown in the example of fig. 5B, if the primary view of the primary device is displaying a participant stream, the participant stream may be displayed on the primary portion 502 of the user interface 500 on the companion device. Then, because a different content stream or participant stream is displayed on the master device, only a subset of the teleconference data is transmitted to the companion device. Thus, reduced bandwidth usage is achieved while also allowing a fully immersive teleconferencing experience.
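The bandwidth saving described above can be sketched as selecting, for the companion device, only streams that are not already presented on the master device, ranked by activity level. The function and the stream representation below are hypothetical illustrations, not the disclosed implementation.

```python
def select_companion_streams(streams, primary_view_ids, limit=1):
    """Pick a sub-portion of the session's streams for a companion device.

    Streams already shown on the master device are excluded; the remainder
    are ranked by descending activity level and truncated to `limit`.
    """
    remaining = [s for s in streams if s["id"] not in primary_view_ids]
    remaining.sort(key=lambda s: s["activity"], reverse=True)
    return [s["id"] for s in remaining[:limit]]

streams = [
    {"id": "content-1", "activity": 0.9},      # already on the master device
    {"id": "participant-2", "activity": 0.7},
    {"id": "participant-3", "activity": 0.2},
]
assert select_companion_streams(streams, {"content-1"}) == ["participant-2"]
```

Because only the selected subset is encoded and transmitted to the companion device, duplicate streams for the same user identity are avoided and overall bandwidth usage drops.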
Turning now to FIG. 6A, aspects of another user interface (interface 600) for a device 106 operating in a companion mode of operation are described. In this particular example, it should be understood that the particular interface 600 is an illustrative example representing a user interface on a mobile device (e.g., a tablet device, mobile phone, or other similar device). It should be appreciated that similar aspects may exist for interface 600 as compared to interface 500. Therefore, a detailed description of similar elements is omitted herein for the sake of brevity.
As shown, the user interface 600 may include a control menu with shared control elements 651 for providing access to various types of media. The shared control elements 651 may be arranged to function similarly to the shared control elements 501 described in detail above. For example, a first selectable element 651A may provide access to a particular file. A second selectable element 651B may provide access to desktop sharing, and a third selectable element 651C may provide access to application sharing. These examples are provided for illustrative purposes only and should not be construed as limiting. It should be appreciated that the selectable elements 651 may provide access to any suitable type of shareable media and may be arranged similarly to the selectable elements 501.
As shown in fig. 6A, interface 600 may also include a display of a session stage 652, which is a video presentation of the second teleconferencing data, or a sub-portion of the teleconferencing data, transmitted to the client device. The display of the conversation stage 652 may show salient features of the conversation, such as the most dominant speaker or shared content. When the user actuates a graphical element (e.g., a chevron pattern displayed in the center of the bottom of the interface), the display of the conversation stage 652 can expand to show additional sharing options available to the companion device. For example, as shown in fig. 6B and 6C, an example of a fully expanded configuration of a user interface 600 of a mobile device operating in a companion mode is provided.
As shown in fig. 6B, user interface 600 may include a control menu with shared control elements 651A-651I for providing access to various types of media. For example, the selectable elements 651A-651I may provide access to specific files, desktop shares, and applications. These examples are provided for illustrative purposes only and should not be construed as limiting. It should be appreciated that the selectable elements 651A-651I may provide access to any suitable type of shareable media. It should also be appreciated that interface 600 may have any suitable number of selectable elements 651. Additionally, the user may scroll through additional selectable elements 651A-651I by actuating the arrows to view additional selectable elements 651. The user interface 600 may also include an "add" button to enable a user to add content (e.g., files, applications, and other content) to the menu. The displayed controls and content may be scaled in rows and columns to accommodate different device sizes, window sizes, user needs, and the like.
The user interface 600 may also provide access to the display of the conversation stage by using gestures (e.g., a swipe down applied to the first UI element 662). When a predetermined gesture is received (e.g., applying a swipe down to the first UI element 662), the user interface 600 transitions to the example user interface 600 shown in fig. 6C, which includes a video presentation of the second teleconferencing data 146(2) presented on the conversation stage 652. While in this mode, other sharing controls 664 (e.g., a send button or a share button) may be located in the user interface 600. The user interface 600 may be configured with or without a sharing control. In response to receiving the predetermined input gesture (e.g., an upward swipe at the second UI element 663), the user interface 600 shown in fig. 6C may transition back to the user interface 600 shown in fig. 6B.
While some devices have been described as "secondary" devices or "companion" devices, it should be understood that some of the functionality of the companion device may be utilized to allow for additional interaction and shareable content, much like an extension of the host device. For example, as shown in fig. 7A-7D, the companion device may implement a shared camera or "streaming camera" functionality to allow conference participants to provide a real-world view of content and the surrounding environment while not interfering with the associated master device. In this example, companion device 106D includes a camera 781 adapted to stream video data to teleconference session 104. When the user points the camera of device 106D (referred to herein as the "streaming camera") at the content 780, device 106D generates a video stream or photograph that is transmitted to the server module 136 to distribute the content. The shared content may include a video stream or still images.
Referring to the example of FIG. 3, FIG. 7A illustrates the fourth computing device 106D operating in the companion operating mode and generating image data via a camera 781. The image data is shared with devices of the teleconference session 104 (e.g., the first computing device 106A and the second computing device 106B). Any content, including whiteboards, drawings, photos, pages of a book, or any other content from the real-world environment, may be shared. Thus, in this example, the teleconference system 102 may receive a request to share video content from the camera 781 of the computing device 106D. Additionally, teleconference system 102 may add the video content to the teleconference data and to sub-portions of the teleconference data for sharing among devices participating in the teleconference session 104.
Continuing with this example, any user in an online conference (e.g., teleconference session 104) can send live video or still images to a shared video stage that treats the streaming camera video as a content stream rather than a traditional self-centric participant stream (also referred to as "talking-head" video). The user may use the companion device to trigger this feature to share real-world content or other content with the conference participants. As shown in fig. 7B, the content 780 may be displayed at any master device (e.g., master device 106E) using an overlay view. Other views (e.g., persistent views) may also be applicable. Other devices associated with teleconference session 104 may also receive and display content 780, as shown in fig. 7C and 7D. Thus, while the companion device may be used to receive second teleconference data 146(2) or a sub-portion of teleconference data 146, the companion device may also be used to augment the teleconference session.
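The key point above, that a companion-device camera feed is registered as a *content* stream rather than a participant stream, can be sketched as follows. The stream representation and helper name are hypothetical, chosen only to illustrate the distinction.

```python
def register_streaming_camera(session_streams, user_id, feed):
    """Add a companion-device camera feed to the session as a content stream.

    Tagging the feed as "content" (rather than "participant") places it on
    the shared stage for all devices, instead of treating it as another
    self-centric 'talking-head' participant stream for the same identity.
    """
    stream = {"owner": user_id, "kind": "content", "source": feed}
    session_streams.append(stream)
    return stream

session_streams = []
s = register_streaming_camera(session_streams, "user301C", "camera-781-feed")
assert s["kind"] == "content"
assert session_streams == [s]
```

Because the feed carries the same user identity as the master device but a different stream kind, the server can distribute it to all devices without creating a duplicate participant stream for that identity.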
As described in detail above, real-time mediated collaboration, e.g., via an online conference, video conference, or document collaboration system, is typically achieved using a single device per user identity. In current collaboration systems, if multiple devices join and are simultaneously associated with a single user identity, the user experience may be essentially duplicated across devices, and inefficiencies including excessive use of bandwidth and the use of multiple audio devices per user identity become apparent. This can have deleterious consequences for the user experience. In such cases, audio feedback typically manifests as "echoes" and "ghosting" of the audio stream. In the worst case, this can cause technical and practical problems, including but not limited to: race conditions for synchronization, notification, and control execution; audio echo and howling; battery consumption; and additional bandwidth usage. The present invention focuses on adding a second, third, and up to an nth device associated with a single user identity, where the additional devices can be automatically distinguished upon joining in order to augment the primary device experience rather than duplicate it. The present invention distinguishes the state of any master device in a real-time collaboration session from the state of any companion device joining the same session when they are associated with a single user identity.
A more detailed description of the operation of the teleconference system 102 is provided below with reference to FIG. 8, and FIG. 8 shows a flowchart of an exemplary method 800 of providing companion devices for real-time collaboration in a teleconference session.
The method 800 may include: at block 802, teleconference data 146 that includes multiple streams 142 associated with teleconference session 104 is generated. For example, the plurality of streams 142 may be received from a plurality of computing devices 106 associated with the teleconference session 104.
The method 800 may further include: at block 804, a first request to join the teleconference session 104 is received from a first computing device (e.g., the third computing device 106C of fig. 3A) associated with a first user identity (e.g., the third user 301C of fig. 3). In general, the request may be substantially similar to the access request, configuration command, and/or reconfiguration command (e.g., CTL 156(1)) described in detail above with reference to fig. 1.
The method 800 may further include: at block 806, the teleconference data 146 is transmitted to the first computing device 106C, causing the first computing device (e.g., computing device 106C) to display the main user interface 400. The main user interface 400 may include presentation of multiple streams 142. For example, as shown in fig. 3B and 4, the user interface 400 may also be arranged to display multiple streams or a single stream. Additionally, user interface 400 may include a plurality of core teleconference control elements 403. Core teleconference control element 403 may include user interface elements configured to control the state of a first computing device (e.g., device 106C) and the state of teleconference 104.
Thereafter, or substantially simultaneously with receiving the first request, the method 800 includes: at block 808, a second request to join the teleconference session 104 is received from a second computing device (e.g., device 106D) associated with the first user identity (e.g., user 301C). For example, user 301C may attempt to join teleconference 104 from device 106C and device 106D at substantially the same time. Additionally, a user (e.g., user 301C) may attempt to first join the teleconference session 104 with device 106C. Subsequently, user 301C can attempt to join teleconference session 104 with companion device 106D.
In response to the second request, the method 800 may include: at block 810, at least one flow of the plurality of flows 142 is selected based on the activity level. In general, when configuring or generating teleconference data 146, first teleconference data 146(1), and/or second teleconference data 146(2) through 146(N), a sub-portion of the teleconference data, or second teleconference data 146(2) through 146(N), may be selected based on the activity levels of the data presented by the participants of teleconference session 104. The video or shared content in the media data for each participant may be analyzed to determine the activity level of any streams 142(1) through 142(M) received at the teleconference system 102. The activity level may be based on any type of activity including, but not limited to, any combination of:
1. Participant motion: the degree to which a participant moves in the video may determine the participant's activity level. A participant who is gesturing or otherwise moving in the video may be considered to be participating in the teleconference at a relatively high level.
2. Lip movements of participants: the video may be analyzed to determine the extent to which the participant's lips have moved as an indication of the extent to which the participant is speaking. A participant who is speaking at a relatively high level may be considered to be participating at a correspondingly relatively high level.
3. Facial expressions of participants: the video of a participant may be analyzed to determine changes in facial expression, or pattern recognition may be used to detect specific facial expressions. A participant who reacts with facial expressions in the teleconference may be considered to be participating at a relatively high level.
4. Content modification: a video of the content being shared in the teleconference can be analyzed to determine whether it is being modified. Thus, recently or actively modified content may have a high activity level.
5. Turning pages of contents: a video of the content being shared may be analyzed to determine if there is, for example, a page turn of the document, and a corresponding activity level assigned thereto.
6. Number of participants with content in the display area: an activity level may be assigned to the video of the content being shared based on the number of participants who have the content in view in their display areas.
7. A participant entering a teleconference: a high activity level may be assigned to media data from a participant entering a teleconference. The value may be based on the order in which the participants joined the session.
8. Participant leaving the teleconference: a low activity level may be assigned to media data from participants who are leaving the teleconference.
Accordingly, participant streams and/or content streams having relatively high activity levels may be selected for display in any of the companion user interface arrangements described above, via transmission of second teleconference data 146(2) through 146(N), or a sub-portion of teleconference data 146, having a relatively high activity level. It should also be understood that the activity-level selection may be overridden by other logic, for example, if a user with multiple devices desires a fixed view on the companion computing device, or in other scenarios.
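One way to combine the factors enumerated above into a single score is a weighted sum per stream. The factors come directly from the list above, but the weights and function names below are purely illustrative assumptions; the disclosure fixes no particular weighting.

```python
# Illustrative weights for the activity factors listed above (hypothetical).
WEIGHTS = {
    "motion": 1.0, "lip_movement": 2.0, "facial_expression": 0.5,
    "content_modified": 2.0, "page_turn": 1.0, "viewer_count": 0.2,
    "joining": 3.0, "leaving": -3.0,
}

def activity_level(signals):
    """Combine detected signals (factor name -> strength in [0, 1])."""
    return sum(WEIGHTS.get(name, 0.0) * value for name, value in signals.items())

def rank_by_activity(streams):
    """Order streams by descending activity level for companion selection."""
    return sorted(streams, key=lambda s: activity_level(s["signals"]), reverse=True)

speaking = {"id": "p1", "signals": {"lip_movement": 0.9, "motion": 0.4}}
idle = {"id": "p2", "signals": {"motion": 0.1}}
assert [s["id"] for s in rank_by_activity([idle, speaking])] == ["p1", "p2"]
```

A negative weight for a leaving participant and a positive weight for a joining one mirror items 7 and 8 above; the sorted result feeds the stream selection at block 810.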
Upon selecting at least one stream, the method 800 further comprises: at block 812, the teleconference data 146 is transmitted to the second computing device 106D, causing the second computing device 106D to display the companion user interface 500. The companion user interface 500 may include a presentation of the at least one stream. It should be appreciated that companion user interface 500 may include the presentation of multiple streams, content streams, participant streams, or any combination thereof. Additionally, companion user interface 500 can include a plurality of companion teleconference control elements 501. Companion teleconference control element 501 may include a user interface element configured to provide content sharing functionality. In general, the content sharing functionality enables the second computing device 106D to share at least one of: a selected file, a desktop view, or a stream generated by the camera 781 of the second computing device 106D.
According to other examples of implementing the method 800, the plurality of streams 142 may also include at least one participant stream and at least one content stream, as described above. Thus, in scenarios in which the teleconference system 102 processes participant streams and content streams, the teleconference system 102 may also transmit the at least one participant stream and the at least one content stream to the first computing device 106C, and the at least one content stream to the second computing device 106D. In this manner, reduced use of bandwidth may be achieved by intelligently selecting streams 142 for transmission to the master and companion devices.
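The device-dependent routing described here (participant and content streams to the primary device, content streams only to the companion) can be sketched as a simple split; the dictionary-based stream descriptors and the function name are assumptions for illustration.

```python
from typing import Dict, List

def route_streams(streams: List[dict], primary_id: str,
                  companion_id: str) -> Dict[str, List[dict]]:
    """Send every stream to the primary device, but only content streams to
    the companion device, reducing bandwidth spent on the companion link."""
    return {
        primary_id: list(streams),
        companion_id: [s for s in streams if s.get("kind") == "content"],
    }
```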
While generally described in relation to the teleconferencing system 102, it should be readily understood that the various devices under the control of a user may also implement one or more of the aspects described above. For example, user 301C may also use multiple devices to request to join the teleconference session 104. In this regard, user 301C may send a first request 156, from a first computing device 106C associated with a first user identity 301C, to join the teleconference session 104. In response to the request 156, the first computing device 106C may receive first teleconference data 146(1) that includes the plurality of streams 142 associated with the teleconference session 104 and display a presentation of the plurality of streams 142.
Similarly, user 301C can send a second request to join the teleconference session 104 from a second computing device 106D associated with the first user identity 301C. In response to the second request, the second computing device 106D may receive second teleconference data 146(2) that includes a subset of streams of the plurality of streams 142 and display a presentation of that subset. The subset of streams of the plurality of streams 142 is based at least in part on data indicative of activity levels associated with the subset of streams 142, as described in detail above.
Accordingly, several aspects of the disclosure may be implemented by the teleconferencing system 102 and the various computing devices 106. In addition, once connected to teleconference session 104, user 301C may send several commands using the user interface elements shown in FIGS. 4 and 5A. For example, user 301C may send one or more core teleconference control requests from first computing device 106C to control the state of teleconference session 104. Similarly, user 301C can send one or more companion teleconference control requests from second computing device 106D to control the content sharing functionality of second computing device 106D. Other forms of request and control, including the "exchange" function described above, are also applicable.
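The "exchange" function referenced above, in which the companion device takes over the full presentation and the primary device drops to the activity-selected subset, might be handled server-side along these lines. The data model (a mapping from device identifier to its routed streams) is an assumption, not part of the disclosure.

```python
from typing import Dict, List

def apply_swap(routes: Dict[str, List[str]], primary_id: str,
               companion_id: str) -> Dict[str, List[str]]:
    """Exchange the presentations of the two devices: the companion receives
    the full set of streams and the primary receives the former subset.
    Returns a new routing table; the input table is left unmodified."""
    swapped = dict(routes)
    swapped[primary_id] = routes[companion_id]
    swapped[companion_id] = routes[primary_id]
    return swapped
```

This mirrors the control command of clause 12, which transmits the full teleconference data to the second computing device and the sub-portion to the first.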
As described in detail herein, several aspects of enhancing a teleconference session by facilitating multiple devices of a user have been provided. Although the techniques have been described in language specific to structural features and/or methodological acts, it is to be understood that the appended claims are not necessarily limited to the described features or acts. Rather, the features and acts are described as exemplary implementations of the techniques.
Operations of exemplary processes are illustrated in respective blocks and are summarized with reference to the blocks. The processes are illustrated as logical flows of blocks, where each block may represent one or more operations that may be implemented in hardware, software, or a combination thereof. In the context of software, the operations represent computer-executable instructions stored on one or more computer-readable media, which when executed by one or more processors, enable the one or more processors to perform the recited operations. Generally, computer-executable instructions include routines, programs, objects, modules, components, data structures, and so forth that perform particular functions or implement particular abstract data types. The order in which the operations are described should not be construed as a limitation, and any number of the described operations can be performed in any order, combined in any order, subdivided into multiple sub-operations, and/or performed in parallel to implement the described processes. The described processes may be performed by resources associated with one or more devices (e.g., one or more internal or external CPUs or GPUs) and/or one or more hardware logic components (e.g., FPGAs, DSPs, or other types of accelerators).
All of the methods and processes described above may be embodied in software code modules executed by one or more general purpose computers or processors and are fully automated. The code modules may be stored in any type of computer-readable storage medium or other computer storage device. Alternatively, some or all of the methods may be embodied in dedicated computer hardware.
Conditional language such as "may," "can," "could," or "might" should be understood within the context that certain examples include certain features, elements, and/or steps, while other examples do not, unless specifically stated otherwise. Thus, such conditional language is not generally intended to imply that certain features, elements, and/or steps are in any way required for one or more examples, or that one or more examples necessarily include logic for deciding, with or without user input or prompting, whether certain features, elements, and/or steps are to be included or performed in any particular example. Unless specifically stated otherwise, conjunctive language such as the phrase "at least one of X, Y, or Z" should be understood to indicate that an item, clause, etc. may be X, Y, or Z, or a combination thereof.
Any routine descriptions, elements, or blocks in the flow diagrams described herein and/or depicted in the figures should be understood as potentially representing modules, segments, or portions of code which include one or more executable instructions for implementing specific logical functions or elements in the routine. Alternative implementations are also included within the scope of the examples described herein, in which elements or functions may be deleted or performed in a different order than shown or discussed (including substantially synchronously or in an opposite order), depending on the functionality involved, as would be understood by those of ordinary skill in the art. It should be emphasized that many variations and modifications may be made to the examples described above, the elements of which are to be understood as other acceptable examples. All such modifications and variations are intended to be included herein within the scope of this disclosure and protected by the following claims.
Clauses:
The numerous features, aspects, and/or details explained above can be expressed in one or more clauses as defined below:
clause 1: a method, comprising: generating teleconference data (146) including a plurality of streams (142) associated with a teleconference session (104); receiving, from a first computing device (106C) associated with a first user identity (301C), a first request to join the teleconference session (104); transmitting the teleconference data (146) to the first computing device (106C), causing the first computing device (106C) to display a main user interface (400), the main user interface (400) including a presentation of the plurality of streams (142); receiving a second request to join the teleconference session (104) from a second computing device (106D) associated with the first user identity (301C); selecting at least one stream of the plurality of streams (142) based on an activity level; and transmitting the teleconference data (146) to the second computing device (106D), causing the second computing device (106D) to display a companion user interface (500), the companion user interface (500) including a presentation of the at least one stream.
Clause 2: the method of clause 1, wherein the main user interface includes a plurality of core teleconference control elements including user interface elements configured to control a state of the first computing device and a state of the teleconference session.
Clause 3: the method of any of the above clauses wherein the companion user interface comprises a plurality of companion teleconference control elements including user interface elements configured to provide content sharing functionality, wherein the content sharing functionality enables the second computing device to share at least one of: a selected file, a desktop view, or a stream generated by a camera of the second computing device.
Clause 4: the method of any of the preceding clauses wherein the activity level represents a level of activity of the at least one stream.
Clause 5: the method of any of the preceding clauses wherein the plurality of streams includes at least one participant stream and at least one content stream, wherein the method further comprises: transmitting the at least one participant stream and the at least one content stream to the first computing device; and transmitting the at least one content stream to the second computing device.
Clause 6: the method of any of the above clauses wherein the at least one participant stream comprises user media data of the teleconference session and the at least one content stream comprises content media data of the teleconference session.
Clause 7: a method, comprising: sending, from a first computing device (106C) associated with a first user identity (301C), a first request to join a teleconference session (104); receiving, at the first computing device (106C), first teleconferencing data (146(1)) comprising a plurality of streams (142) associated with the teleconferencing session (104); displaying a presentation of the plurality of streams (142) on the first computing device (106C); sending, from a second computing device (106D) associated with the first user identity (301C), a second request to join the teleconference session (104); receiving, at the second computing device (106D), second teleconferencing data that includes a subset of streams of the plurality of streams (142); and displaying, on the second computing device (106D), a presentation of the subset of streams of the plurality of streams (142), wherein the subset of streams is based at least in part on data indicative of an activity level associated with the subset of streams (142).
Clause 8: the method of any of the preceding clauses wherein the method further comprises: sending one or more core teleconference control requests from the first computing device to control a state of the teleconference session.
Clause 9: the method of any of the preceding clauses wherein the method further comprises: sending one or more companion teleconference control requests from the second computing device to control content sharing functionality of the second computing device.
Clause 10: the method of any of the above clauses wherein the plurality of streams includes a plurality of participant streams and a plurality of content streams, wherein the plurality of participant streams includes user media data of the teleconference session and the plurality of content streams includes content media data of the teleconference session.
Clause 11: a method, comprising: generating teleconference data (146) including a plurality of streams (142) associated with a teleconference session (104); receiving, from a first computing device (106C) associated with a first user identity (301C), a first request to join the teleconference session (104); receiving a second request to join the teleconference session (104) from a second computing device (106D) associated with the first user identity (301C); transmitting the teleconferencing data (146) to the first computing device (106C); and transmitting a sub-portion (146(2)) of the teleconferencing data to the second computing device (106D), wherein the sub-portion (146(2)) comprises at least one stream selected from the plurality of streams (142) based on an activity level.
Clause 12: the method of any of the preceding clauses wherein the method further comprises: receiving a control command to cause a transition of the first computing device and the second computing device; and in response to receiving the control command, transmitting the teleconferencing data to the second computing device to display a presentation of the plurality of streams; and transmitting the sub-portion of the teleconferencing data to the first computing device to display a presentation of the at least one stream.
Clause 13: the method of any of the preceding clauses wherein the method further comprises: receiving a reconfiguration command to cause a transition to the first computing device; and in response to receiving the reconfiguration command, ceasing transmission of the sub-portion of the teleconference data to the second computing device.
Clause 14: the method of any of the preceding clauses wherein the method further comprises: disabling a camera component of the second computing device.
Clause 15: the method of any of the preceding clauses wherein the method further comprises: disabling an audio component of the second computing device.
Clause 16: the method of any of the preceding clauses wherein the method further comprises: receiving a request to share video content from a camera of the second computing device; and adding the video content to the teleconference data and the sub-portion of the teleconference data.
Clause 17: a system comprising one or more processing units (202), and a computer-readable medium (204) having computer-executable instructions encoded thereon to cause the one or more processing units (202) to: generate teleconference data (146) including a plurality of streams (142) associated with a teleconference session (104); receive, from a first computing device (106C) associated with a first user identity (301C), a first request to join the teleconference session (104); transmit the teleconference data (146) to the first computing device (106C) to display a main user interface (400), the main user interface (400) including a plurality of core teleconference control elements (403); receive a second request to join the teleconference session (104) from a second computing device (106D) associated with the first user identity (301C); select at least one stream of the plurality of streams (142) based on an activity level; and transmit the teleconference data (146) to the second computing device (106D), causing the second computing device (106D) to display a companion user interface (500), the companion user interface (500) including a plurality of companion teleconference control elements (501).
Clause 18: the system of clause 17, wherein the plurality of streams includes at least one participant stream and at least one content stream, wherein the computer-executable instructions further cause the one or more processing units to: transmit the at least one participant stream and the at least one content stream to the first computing device; and transmit the at least one content stream to the second computing device.
Clause 19: the system of any of the preceding clauses wherein the computer-executable instructions further cause the one or more processing units to: receive a reconfiguration command to cause a transition to the first computing device; and in response to receiving the reconfiguration command, cease transmission of the teleconference data to the second computing device.
Clause 20: the system of any of the preceding clauses wherein the computer-executable instructions further cause the one or more processing units to: receive a control command to cause a transition of the first computing device and the second computing device; and in response to receiving the control command, transmit the teleconferencing data to the second computing device, causing presentation of the plurality of streams; and transmit the teleconferencing data to the first computing device, causing presentation of the at least one stream.

Claims (15)

1. A method for a teleconferencing session, comprising:
generating teleconference data including a plurality of streams associated with a teleconference session;
receiving, from a first computing device associated with a first user identity, a first request to join the teleconference session;
transmitting the teleconference data to the first computing device, causing the first computing device to display a main user interface, the main user interface including a presentation of the plurality of streams;
receiving, from a second computing device associated with the first user identity, a second request to join the teleconference session;
in response to receiving the second request from the second computing device to join the teleconference session with the first user identity, selecting at least one stream of the plurality of streams based on an activity level of a participant presented in the at least one stream; and
transmitting the teleconference data to the second computing device, causing the second computing device to display a companion user interface that includes a presentation of the at least one stream selected based on the activity level of the participant presented in the at least one stream, wherein an arrangement of the companion user interface is different from an arrangement of the main user interface.
2. The method of claim 1, wherein the main user interface comprises a plurality of core teleconference control elements including user interface elements configured to control a state of the first computing device and a state of the teleconference session.
3. The method of claim 1, wherein the companion user interface comprises a plurality of companion teleconference control elements including user interface elements configured to provide content sharing functionality, wherein the content sharing functionality enables the second computing device to share at least one of: a selected file, a desktop view, or a stream generated by a camera of the second computing device.
4. The method of claim 1, wherein the activity level represents a level of activity of the at least one stream.
5. The method of claim 1, wherein the plurality of streams includes at least one participant stream and at least one content stream, wherein the method further comprises:
transmitting the at least one participant stream and the at least one content stream to the first computing device; and
transmitting the at least one content stream to the second computing device.
6. The method of claim 5, wherein the at least one participant stream comprises user media data of the teleconference session and the at least one content stream comprises content media data of the teleconference session.
7. A method for a teleconferencing session, comprising:
sending, from a first computing device associated with a first user identity, a first request to join a teleconference session;
receiving, at the first computing device, first teleconferencing data that includes a plurality of streams associated with the teleconferencing session;
displaying, on the first computing device, a presentation of the plurality of streams;
sending, from a second computing device associated with the first user identity, a second request to join the teleconference session;
receiving, at the second computing device, second teleconferencing data that includes a subset of streams of the plurality of streams; and
displaying, on the second computing device, a presentation of the subset of streams of the plurality of streams, wherein the subset of streams of the plurality of streams is selected based at least in part on data indicative of an activity level of a participant presented in at least one stream of the subset of streams, wherein the subset of streams is selected in response to the second request from the second computing device to join the teleconference session with the first user identity, wherein an arrangement of a user interface of the second computing device is different from an arrangement of a user interface of the first computing device.
8. The method of claim 7, wherein the method further comprises:
sending one or more core teleconference control requests from the first computing device to control a state of the teleconference session.
9. The method of claim 7, wherein the method further comprises:
sending one or more companion teleconference control requests from the second computing device to control content sharing functionality of the second computing device.
10. The method of claim 7, wherein the plurality of streams comprises a plurality of participant streams and a plurality of content streams, wherein the plurality of participant streams comprise user media data of the teleconference session and the plurality of content streams comprise content media data of the teleconference session.
11. A method for a teleconferencing session, comprising:
generating teleconference data including a plurality of streams associated with a teleconference session;
receiving, from a first computing device associated with a first user identity, a first request to join the teleconference session;
receiving, from a second computing device associated with the first user identity, a second request to join the teleconference session;
transmitting the teleconferencing data to the first computing device; and
transmitting a sub-portion of the teleconferencing data to the second computing device, wherein the sub-portion comprises at least one stream selected from the plurality of streams based on an activity level of participants presented in the at least one stream, wherein the at least one stream is selected in response to the second request from the second computing device to join the teleconferencing session with the first user identity, wherein an arrangement of a user interface of the second computing device is different from an arrangement of a user interface of the first computing device.
12. The method of claim 11, wherein the method further comprises:
receiving a control command to cause a transition of the first computing device and the second computing device; and
in response to receiving the control command:
transmitting the teleconference data to the second computing device to display a presentation of the plurality of streams; and
transmitting the sub-portion of the teleconferencing data to the first computing device to display a presentation of the at least one stream.
13. The method of claim 11, wherein the method further comprises:
receiving a reconfiguration command that causes a transition to the first computing device; and
in response to receiving the reconfiguration command, ceasing transmission of the sub-portion of the teleconference data to the second computing device.
14. The method of claim 11, wherein the method further comprises:
disabling a camera component of the second computing device.
15. The method of claim 11, wherein the method further comprises:
disabling an audio component of the second computing device.
CN201780067116.9A 2016-10-31 2017-10-24 Companion device for real-time collaboration in a teleconference session Active CN109923859B (en)

Applications Claiming Priority (5)

Application Number Priority Date Filing Date Title
US201662415403P 2016-10-31 2016-10-31
US62/415,403 2016-10-31
US15/480,332 US11310294B2 (en) 2016-10-31 2017-04-05 Companion devices for real-time collaboration in communication sessions
US15/480,332 2017-04-05
PCT/US2017/057948 WO2018081030A1 (en) 2016-10-31 2017-10-24 Companion devices for real-time collaboration in teleconference sessions

Publications (2)

Publication Number Publication Date
CN109923859A CN109923859A (en) 2019-06-21
CN109923859B true CN109923859B (en) 2021-07-30

Family

ID=62019994

Family Applications (2)

Application Number Title Priority Date Filing Date
CN201780067327.2A Withdrawn CN109891878A (en) 2016-10-31 2017-10-24 Enhanced techniques for joining teleconferencing sessions
CN201780067116.9A Active CN109923859B (en) 2016-10-31 2017-10-24 Companion device for real-time collaboration in a teleconference session

Family Applications Before (1)

Application Number Title Priority Date Filing Date
CN201780067327.2A Withdrawn CN109891878A (en) 2016-10-31 2017-10-24 Enhanced techniques for joining teleconferencing sessions

Country Status (4)

Country Link
US (2) US11310294B2 (en)
EP (2) EP3533222A1 (en)
CN (2) CN109891878A (en)
WO (2) WO2018081030A1 (en)

Families Citing this family (44)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7599983B2 (en) * 2002-06-18 2009-10-06 Wireless Ink Corporation Method, apparatus and system for management of information content for enhanced accessibility over wireless communication networks
EP4365725A2 (en) 2014-05-30 2024-05-08 Apple Inc. Continuity
DK201670622A1 (en) 2016-06-12 2018-02-12 Apple Inc User interfaces for transactions
US10728193B2 (en) * 2017-11-17 2020-07-28 International Business Machines Corporation Receiving and sharing files in a group messaging environment
DK201870364A1 (en) 2018-05-07 2019-12-03 Apple Inc. Multi-participant live communication user interface
CN109005443B (en) * 2018-08-24 2021-05-28 重庆虚拟实境科技有限公司 Real-person remote interaction method for VR-AR all-in-one machine terminal and system based on same
CN109059901B (en) * 2018-09-06 2020-02-11 深圳大学 AR navigation method based on social application, storage medium and mobile terminal
US11128792B2 (en) 2018-09-28 2021-09-21 Apple Inc. Capturing and displaying images with multiple focal planes
US10616151B1 (en) * 2018-10-17 2020-04-07 Asana, Inc. Systems and methods for generating and presenting graphical user interfaces
CN111258528B (en) * 2018-12-03 2021-08-13 华为技术有限公司 Voice user interface display method and conference terminal
US11190568B2 (en) * 2019-01-09 2021-11-30 Bose Corporation Multimedia communication encoding system
US10764442B1 (en) * 2019-03-27 2020-09-01 Lenovo (Singapore) Pte Ltd Muting an audio device participating in a conference call
US11271762B2 (en) * 2019-05-10 2022-03-08 Citrix Systems, Inc. Systems and methods for virtual meetings
CN117170620A (en) 2019-05-31 2023-12-05 苹果公司 User interface for audio media controls
US11010121B2 (en) 2019-05-31 2021-05-18 Apple Inc. User interfaces for audio media control
CN117499725A (en) * 2019-06-17 2024-02-02 谷歌有限责任公司 Method, system and medium for providing dynamic media sessions
US11240470B2 (en) * 2019-07-08 2022-02-01 Nextiva, Inc. Multi-device teleconferences
CN110430384B (en) * 2019-08-23 2020-11-03 珠海格力电器股份有限公司 Video call method and device, intelligent terminal and storage medium
US11256392B2 (en) * 2019-11-01 2022-02-22 Microsoft Technology Licensing, Llc Unified interfaces for paired user computing devices
US11546391B2 (en) 2019-11-01 2023-01-03 Microsoft Technology Licensing, Llc Teleconferencing interfaces and controls for paired user computing devices
US11304246B2 (en) 2019-11-01 2022-04-12 Microsoft Technology Licensing, Llc Proximity-based pairing and operation of user-specific companion devices
US11057702B1 (en) * 2019-12-20 2021-07-06 Microsoft Technology Licensing, Llc Method and system for reducing audio feedback
CN112312058B (en) * 2020-03-22 2023-06-27 北京字节跳动网络技术有限公司 Interaction method and device and electronic equipment
CN111596985B (en) * 2020-04-24 2023-03-14 腾讯科技(深圳)有限公司 Interface display method, device, terminal and medium in multimedia conference scene
CN112311754B (en) * 2020-06-02 2022-09-30 北京字节跳动网络技术有限公司 Interaction method and device and electronic equipment
US20220086212A1 (en) * 2020-09-15 2022-03-17 Carrier Corporation Digital Data Processing
US11392291B2 (en) 2020-09-25 2022-07-19 Apple Inc. Methods and interfaces for media control with dynamic feedback
US11769115B1 (en) 2020-11-23 2023-09-26 Asana, Inc. Systems and methods to provide measures of user workload when generating units of work based on chat sessions between users of a collaboration environment
WO2022140557A1 (en) * 2020-12-22 2022-06-30 Dolby Laboratories Licensing Corporation Acoustic feedback reduction in co-located audioconferencing devices
US11843647B2 (en) * 2021-04-30 2023-12-12 Zoom Video Communications, Inc. Control of dedicated meeting room devices for video communications
US20220360949A1 (en) * 2021-05-05 2022-11-10 Lightspeed Technologies, Inc. Registration and mode switching for devices in a distributed audio system
US11893214B2 (en) 2021-05-15 2024-02-06 Apple Inc. Real-time communication user interface
US11907605B2 (en) * 2021-05-15 2024-02-20 Apple Inc. Shared-content session user interfaces
US20220368548A1 (en) 2021-05-15 2022-11-17 Apple Inc. Shared-content session user interfaces
US11847378B2 (en) 2021-06-06 2023-12-19 Apple Inc. User interfaces for audio routing
US20230064462A1 (en) * 2021-08-24 2023-03-02 International Business Machines Corporation Cognitive core network selection for conference calling
GB2612015A (en) * 2021-09-06 2023-04-26 Laduma Ltd System and method for interactive meeting with both in-room attendees and remote attendees
US11677906B2 (en) 2021-09-10 2023-06-13 Zoom Video Communications, Inc. Secondary mode device software access for primary mode device users
US11770600B2 (en) 2021-09-24 2023-09-26 Apple Inc. Wide angle video conference
CN113872778B (en) * 2021-10-15 2023-07-14 北京百度网讯科技有限公司 Device connection method, device and storage medium
US11775606B2 (en) * 2021-10-29 2023-10-03 Weld North Education LLC Inter-browser presentation control
US20230199041A1 (en) * 2021-12-21 2023-06-22 Nevolane Business Gmbh Remote collaboration platform
WO2023196231A1 (en) * 2022-04-04 2023-10-12 Apple Inc. User interfaces for camera sharing
CN115242566B (en) * 2022-06-28 2023-09-05 深圳乐播科技有限公司 Cloud conference joining method and device

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103053150A (en) * 2010-07-26 2013-04-17 思科技术公司 Method for transferring a collaboration session
CN104685864A (en) * 2012-06-25 2015-06-03 英特尔公司 Video conferencing transitions among a plurality of devices
CN105264883A (en) * 2013-05-20 2016-01-20 思杰系统有限公司 Joining an electronic conference in response to sound
CN105474653A (en) * 2013-06-18 2016-04-06 微软技术许可有限责任公司 Unpaired devices
CN105704426A (en) * 2016-03-17 2016-06-22 华为技术有限公司 Method, device and system for video conferencing
US9445054B2 (en) * 2011-09-16 2016-09-13 Ricoh Company, Ltd. Information providing apparatus, transmission system, information providing method, and recording medium storing information providing program

Family Cites Families (84)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR100674792B1 (en) 1998-11-24 2007-01-26 텔레폰악티에볼라겟엘엠에릭슨(펍) Mobile telephone auto pc logon
FI110560B (en) * 2000-12-27 2003-02-14 Nokia Corp Grouping of wireless communication terminals
US8934382B2 (en) * 2001-05-10 2015-01-13 Polycom, Inc. Conference endpoint controlling functions of a remote device
US20030048174A1 (en) 2001-09-11 2003-03-13 Alcatel, Societe Anonyme Electronic device capable of wirelessly transmitting a password that can be used to unlock/lock a password protected electronic device
US7191233B2 (en) * 2001-09-17 2007-03-13 Telecommunication Systems, Inc. System for automated, mid-session, user-directed, device-to-device session transfer system
US8081205B2 (en) 2003-10-08 2011-12-20 Cisco Technology, Inc. Dynamically switched and static multiple video streams for a multimedia conference
US7279060B2 (en) 2004-05-04 2007-10-09 Eastman Kodak Company Guarded cover film for LCD polarizers
US7450084B2 (en) 2004-12-17 2008-11-11 Microsoft Corporation System and method for managing computer monitor configurations
US7554571B1 (en) 2005-03-18 2009-06-30 Avaya Inc. Dynamic layout of participants in a multi-party video conference
US20060224882A1 (en) 2005-03-31 2006-10-05 Microsoft Corporation Method and system for unlocking a computing device
US8661540B2 (en) 2005-10-07 2014-02-25 Imation Corp. Method and apparatus for secure credential entry without physical entry
US7394366B2 (en) 2005-11-15 2008-07-01 Mitel Networks Corporation Method of detecting audio/video devices within a room
US20070300165A1 (en) 2006-06-26 2007-12-27 Microsoft Corporation, Corporation In The State Of Washington User interface for sub-conferencing
US8139752B2 (en) 2006-07-28 2012-03-20 Ubiquity Software Corporation Limited Voice conference control from an instant messaging session using an automated agent
US8503651B2 (en) 2006-12-27 2013-08-06 Nokia Corporation Teleconferencing configuration based on proximity information
US7973857B2 (en) 2006-12-27 2011-07-05 Nokia Corporation Teleconference group formation using context information
US8677270B2 (en) 2007-05-04 2014-03-18 Microsoft Corporation Live companion user interface
US20080281971A1 (en) 2007-05-07 2008-11-13 Nokia Corporation Network multimedia communication using multiple devices
US8446454B2 (en) 2007-05-21 2013-05-21 Polycom, Inc. Dynamic adaption of a continuous presence videoconferencing layout based on video content
KR20100007625A (en) 2008-07-14 2010-01-22 LG Electronics Inc. Mobile terminal and method for displaying menu thereof
US8350891B2 (en) 2009-11-16 2013-01-08 Lifesize Communications, Inc. Determining a videoconference layout based on numbers of participants
US20110119389A1 (en) * 2009-11-19 2011-05-19 Microsoft Corporation Transferring multiple communication modalities during a conversation
US20110271207A1 (en) * 2010-04-30 2011-11-03 American Teleconferencing Services Ltd. Location-Aware Conferencing
CN103039072A (en) 2010-05-25 2013-04-10 维德约股份有限公司 Systems and methods for scalable video communication using multiple cameras and multiple monitors
US8379077B2 (en) 2010-11-24 2013-02-19 Cisco Technology, Inc. Automatic layout and speaker selection in a continuous presence video conference
US8647202B2 (en) 2010-12-16 2014-02-11 Microsoft Corporation Companion object customization
US20120182384A1 (en) 2011-01-17 2012-07-19 Anderson Eric C System and method for interactive video conferencing
CN102625304B (en) 2011-01-27 2016-01-20 Tencent Technology (Shenzhen) Co., Ltd. System, device and method for remembering passwords of associated applications on a failed mobile terminal
WO2012151290A1 (en) 2011-05-02 2012-11-08 Apigy Inc. Systems and methods for controlling a locking mechanism using a portable electronic device
US8918722B2 (en) 2011-05-06 2014-12-23 David H. Sitrick System and methodology for collaboration in groups with split screen displays
US9369673B2 (en) 2011-05-11 2016-06-14 Blue Jeans Network Methods and systems for using a mobile device to join a video conference endpoint into a video conference
US9565708B2 (en) 2011-05-20 2017-02-07 Microsoft Technology Licensing, Llc Auto-connect in a peer-to-peer network
US20160342784A1 (en) 2011-07-15 2016-11-24 Vmware, Inc. Mobile device authentication
US9800688B2 (en) 2011-09-12 2017-10-24 Microsoft Technology Licensing, Llc Platform-enabled proximity service
US8896651B2 (en) 2011-10-27 2014-11-25 Polycom, Inc. Portable devices as videoconferencing peripherals
US9021557B2 (en) 2011-10-27 2015-04-28 Stmicroelectronics Pte Ltd System and method for security using a sibling smart card
US9131333B2 (en) 2011-12-30 2015-09-08 Linkedin Corporation Systems and methods for mobile device pairing
US20130219288A1 (en) 2012-02-20 2013-08-22 Jonathan Rosenberg Transferring of Communication Event
US8786517B2 (en) 2012-02-21 2014-07-22 Blackberry Limited System and method for displaying a user interface across multiple electronic devices
US20130219303A1 (en) 2012-02-21 2013-08-22 Research In Motion Tat Ab Method, apparatus, and system for providing a shared user interface
US20140006495A1 (en) 2012-06-29 2014-01-02 International Business Machines Corporation Authorizing access to a web conference for a specified duration of time
US20140026070A1 (en) 2012-07-17 2014-01-23 Microsoft Corporation Dynamic focus for conversation visualization environments
US20140028726A1 (en) 2012-07-30 2014-01-30 Nvidia Corporation Wireless data transfer based spanning, extending and/or cloning of display data across a plurality of computing devices
US10061829B2 (en) 2012-08-10 2018-08-28 Nec Corporation Method and system for providing content for user devices
US9026053B2 (en) 2013-02-17 2015-05-05 Fitbit, Inc. System and method for wireless device pairing
US9369672B2 (en) 2013-03-14 2016-06-14 Polycom, Inc. Intelligent layouts for call scaling and layout persistence
WO2014142960A1 (en) 2013-03-15 2014-09-18 Intel Corporation Mechanisms for locking computing devices
US9137723B2 (en) 2013-03-15 2015-09-15 Facebook, Inc. Portable platform for networked computing
US20140315489A1 (en) 2013-04-22 2014-10-23 Htc Corporation Method for performing wireless display sharing, and associated apparatus and associated computer program product
US10243786B2 (en) 2013-05-20 2019-03-26 Citrix Systems, Inc. Proximity and context aware mobile workspaces in enterprise systems
CN104469256B (en) * 2013-09-22 2019-04-23 Cisco Technology, Inc. Immersive and interactive video conference room environment
US9088694B2 (en) 2013-10-03 2015-07-21 Avaya, Inc. Adjusting video layout
GB201319687D0 (en) 2013-11-07 2013-12-25 Microsoft Corp Call handling
US9391982B1 (en) 2014-02-27 2016-07-12 Cullen/Frost Bankers, Inc. Network authentication of multiple profile accesses from a single remote device
US20150271273A1 (en) 2014-03-18 2015-09-24 CafeX Communications Inc. System for Using a Device as a Side Car
US10055567B2 (en) 2014-05-30 2018-08-21 Apple Inc. Proximity unlock and lock operations for electronic devices
WO2015187941A1 (en) 2014-06-05 2015-12-10 Reel, Inc. Apparatus and method for sharing content items among a plurality of mobile devices
US9485414B2 (en) 2014-06-20 2016-11-01 John Visosky Eye contact enabling device for video conferencing
US9602771B2 (en) 2014-12-10 2017-03-21 Polycom, Inc. Automated layouts optimized for multi-screen and multi-camera videoconferencing calls
US9438643B2 (en) * 2015-01-15 2016-09-06 Cisco Technology, Inc. Multi-device conference participation
US10440757B2 (en) 2015-02-17 2019-10-08 Google Llc Second-screen control automatic pairing using push notifications
US10701120B2 (en) 2015-03-06 2020-06-30 Disney Enterprises, Inc. Proximity based entitlement sharing
US9819902B2 (en) 2015-03-19 2017-11-14 Microsoft Technology Licensing, Llc Proximate resource pooling in video/audio telecommunications
US9848075B1 (en) 2015-05-14 2017-12-19 Invoy Technologies, Llc Communication system for pairing user devices with medical devices
US9913079B2 (en) 2015-06-05 2018-03-06 Apple Inc. Cloud-based proximity pairing and switching for peer-to-peer devices
US9801219B2 (en) 2015-06-15 2017-10-24 Microsoft Technology Licensing, Llc Pairing of nearby devices using a synchronized cue signal
US11106417B2 (en) 2015-06-23 2021-08-31 Airwatch, Llc Collaboration systems with managed screen sharing
US9602956B1 (en) 2015-08-25 2017-03-21 Yahoo! Inc. System and method for device positioning with bluetooth low energy distributions
US10031577B2 (en) 2015-10-05 2018-07-24 International Business Machines Corporation Gaze-aware control of multi-screen experience
US10244057B2 (en) 2015-10-09 2019-03-26 Adobe Systems Incorporated Techniques for associating and sharing data from multiple local devices
US10334076B2 (en) 2016-02-22 2019-06-25 Google Llc Device pairing in augmented/virtual reality environment
US10999331B1 (en) 2016-07-06 2021-05-04 Google Llc Reverse discovery and pairing of client devices to a media device
US20180267774A1 (en) 2017-03-16 2018-09-20 Cisco Technology, Inc. Conference assistant device with configurable user interfaces based on operational state
US10838681B2 (en) 2017-04-05 2020-11-17 Panasonic Avionics Corporation Screen mirroring from personal electronic devices to transportation vehicle display screens
KR20200128057A (en) 2018-03-02 2020-11-11 Nitto Denko Corporation Device pairing system and method and device communication control system and method
KR20230017925A (en) 2018-04-02 2023-02-06 Google LLC Methods, devices, and systems for interactive cloud gaming
US10824384B2 (en) 2018-04-30 2020-11-03 Dell Products L.P. Controller for providing sharing between visual devices
DK201870364A1 (en) 2018-05-07 2019-12-03 Apple Inc. Multi-participant live communication user interface
US11095659B2 (en) 2018-05-30 2021-08-17 Cisco Technology, Inc. Personalized services based on confirmed proximity of user
US10841174B1 (en) 2018-08-06 2020-11-17 Apple Inc. Electronic device with intuitive control interface
US20210004454A1 (en) 2019-07-07 2021-01-07 Apple Inc. Proof of affinity to a secure event for frictionless credential management
US11304246B2 (en) 2019-11-01 2022-04-12 Microsoft Technology Licensing, Llc Proximity-based pairing and operation of user-specific companion devices
US11546391B2 (en) 2019-11-01 2023-01-03 Microsoft Technology Licensing, Llc Teleconferencing interfaces and controls for paired user computing devices
US11256392B2 (en) 2019-11-01 2022-02-22 Microsoft Technology Licensing, Llc Unified interfaces for paired user computing devices

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
New service development trends under triple-network convergence; Yang Yujia; Telecommunications Science (《电信科学》); 2011-03-31; pp. 17-21 *

Also Published As

Publication number Publication date
EP3533222A1 (en) 2019-09-04
US20180124128A1 (en) 2018-05-03
US11310294B2 (en) 2022-04-19
CN109891878A (en) 2019-06-14
EP3533221A1 (en) 2019-09-04
WO2018081029A1 (en) 2018-05-03
US11212326B2 (en) 2021-12-28
US20180124136A1 (en) 2018-05-03
CN109923859A (en) 2019-06-21
WO2018081030A1 (en) 2018-05-03

Similar Documents

Publication Publication Date Title
CN109923859B (en) Companion device for real-time collaboration in a teleconference session
US10863136B2 (en) Switch view functions for teleconference sessions
CN109891827B (en) Integrated multi-tasking interface for telecommunications sessions
US10863227B2 (en) Event conditioned views for communication sessions
US10334208B2 (en) Technologies for following participants in a video conference
US10509964B2 (en) Toggle view functions for teleconferencing sessions
EP3881170B1 (en) Interactive viewing system
US20180232920A1 (en) Contextually aware location selections for teleconference monitor views
US20180295158A1 (en) Displaying group expressions for teleconference sessions
US9924136B1 (en) Coordinated display transitions of people and content
US20200201512A1 (en) Interactive editing system
US11785194B2 (en) Contextually-aware control of a user interface displaying a video and related user text
EP3262581A1 (en) Opening new application window in response to remote resource sharing
US10942633B2 (en) Interactive viewing and editing system

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant