WO2022072482A1 - Video teleconferencing - Google Patents

Video teleconferencing

Info

Publication number
WO2022072482A1
WO2022072482A1 (PCT/US2021/052632)
Authority
WO
WIPO (PCT)
Prior art keywords
user input
video
response
video conferencing
outgoing
Prior art date
Application number
PCT/US2021/052632
Other languages
French (fr)
Inventor
Sara Beth Ulius-Sabel
Lee M. Jones
Lisa Lyman
Matthew C Smith
Martin R Bodley
Timothy J. Meredith
Kevin Ernst
Original Assignee
Bose Corporation
Priority date: 2020-10-01 (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date: 2021-09-29
Publication date: 2022-04-07
Application filed by Bose Corporation
Publication of WO2022072482A1

Classifications

    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04N - PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N7/00 - Television systems
    • H04N7/14 - Systems for two-way working
    • H04N7/15 - Conference systems
    • H04N7/141 - Systems for two-way working between two video terminals, e.g. videophone
    • H04N7/147 - Communication arrangements, e.g. identifying the communication as a video-communication, intermediate storage of the signals


Abstract

Systems, methods, and computer instructions are provided for controlling a video teleconference station. A single user interaction is received as an activating user input, and in response thereto each of an outgoing audio signal and an outgoing video signal is altered. Each of a camera and a microphone may be disabled or muted, for example. Additional functionality may be associated with the user input, which may be a button or other sensor in various examples.

Description

VIDEO TELECONFERENCING
CROSS REFERENCE TO RELATED APPLICATIONS
This application claims the benefit of U.S. Patent Application Serial Number 63/086,354, filed on October 1, 2020, titled “VIDEO TELECONFERENCING,” the content of which is incorporated herein in its entirety for all purposes.
BACKGROUND
With the increase in working from home, such as due to the Covid-19 pandemic, video conferencing usage has increased substantially. People are finding that at-home distractions, such as barking dogs, screaming and playing kids, spouses, partners, or roommates walking around in the background, ringing doorbells, and the like, make it necessary to quickly and easily assert control over aspects of participating in a video conference, such as by controlling the microphone, speaker, and video camera to avoid the social consequences of having undesirable or distracting activities seen or heard by others on the video conference call.
SUMMARY
Systems and methods disclosed herein are directed to video conferencing control systems, methods, and applications that control or adjust the local components of a video teleconference station.
Systems, methods, and computer-readable instructions are provided that at least partially control a video conferencing station. The systems, methods, and instructions receive a user input, alter an outgoing audio associated with the video conferencing station in response to the user input, and alter an outgoing video associated with the video conferencing station in response to the user input.
Some examples also reduce an output volume of a loudspeaker associated with the video conferencing station in response to the user input.
Various examples may also post a message via the video conferencing station in response to the user input.
Certain examples reduce an output volume of a loudspeaker associated with the video conferencing station in response to the user input and post a message via the video conferencing station in response to the user input. In some examples, a user-defined function may be executed in response to the user input.
According to various examples, altering the outgoing video comprises disabling the camera.
In certain examples altering the outgoing video comprises at least one of blurring a background, replacing the background, or replacing an entire view of the outgoing video.
According to some examples, altering the outgoing audio comprises muting a microphone.
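For purposes of illustration only, the following sketch (in Python) shows one way the control flow summarized above could be organized. The VideoConferencingStation interface, its attribute and method names, and the example volume level are hypothetical placeholders, not part of this disclosure or of any particular conferencing product.

    # Illustrative sketch only; the VideoConferencingStation interface below is a
    # hypothetical stand-in for whatever API a real conferencing client exposes.
    from dataclasses import dataclass, field
    from typing import Callable, List, Optional


    @dataclass
    class VideoConferencingStation:
        mic_muted: bool = False
        camera_enabled: bool = True
        speaker_volume: float = 1.0
        chat_log: List[str] = field(default_factory=list)

        def mute_microphone(self) -> None:
            self.mic_muted = True          # alter outgoing audio

        def disable_camera(self) -> None:
            self.camera_enabled = False    # alter outgoing video

        def set_speaker_volume(self, level: float) -> None:
            self.speaker_volume = level    # reduce loudspeaker output

        def post_message(self, text: str) -> None:
            self.chat_log.append(text)     # post a message to the conference


    def handle_user_input(station: VideoConferencingStation,
                          message: Optional[str] = None,
                          user_function: Optional[Callable[[], None]] = None) -> None:
        """Respond to a single activating user input."""
        station.mute_microphone()           # alter the outgoing audio
        station.disable_camera()            # alter the outgoing video
        station.set_speaker_volume(0.2)     # optional: reduce the loudspeaker volume
        if message is not None:
            station.post_message(message)   # optional: post a message
        if user_function is not None:
            user_function()                 # optional: execute a user-defined function


    if __name__ == "__main__":
        station = VideoConferencingStation()
        handle_user_input(station, message="Be right back")
        print(station)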
Still other aspects, examples, and advantages of these exemplary aspects and examples are discussed in detail below. Examples disclosed herein may be combined with other examples in any manner consistent with at least one of the principles disclosed herein, and references to “an example,” “some examples,” “an alternate example,” “various examples,” “one example” or the like are not necessarily mutually exclusive and are intended to indicate that a particular feature, structure, or characteristic described may be included in at least one example. The appearances of such terms herein are not necessarily all referring to the same example.
BRIEF DESCRIPTION OF THE DRAWINGS
Various aspects of at least one example are discussed below with reference to the accompanying figures, which are not intended to be drawn to scale. The figures are included to provide illustration and a further understanding of the various aspects and examples and are incorporated in and constitute a part of this specification but are not intended as a definition of the limits of the invention(s). In the figures, identical or nearly identical components illustrated in various figures may be represented by a like reference character or numeral. For purposes of clarity, not every component may be labeled in every figure. In the figures:
FIG. 1 illustrates an example system 100 that provides a control input to a video conferencing system; and
FIG. 2 illustrates another example system 200 that provides a control input to a video conferencing system.
DESCRIPTION
Aspects of the present disclosure are directed to systems and methods for providing quick control of video conferencing gear and features such as microphone(s), camera(s), loudspeaker(s), messaging, backgrounds, etc.
Aspects and examples of systems and methods disclosed herein include physical user interface controls, such as on a remote control or conferencing accessory product, that provide easy, one-touch access to frequently used remote or video conferencing meeting controls, as well as dedicated button(s) or other sensors to perform multiple meeting management actions at once. In various examples, the controls, buttons, sensors, and the like may be “virtual,” such as a “button” on a screen that may be selected by a user by “clicking” on it with a pointer or cursor controlled by a user input device, such as a computer mouse, touchscreen, or the like.
FIG. 1 illustrates a system 100 that includes a docking capability to hold a personal portable device, such as a smart phone or tablet, to thereby act as, or to otherwise be coupled to, a video conferencing station. FIG. 2 illustrates another system 200 in the form of an accessory that couples to a video conferencing station, which may be any computing device having software to perform the functions of a video conferencing station, such as a personal computer, tablet, smart phone, etc. The system 200 of FIG. 2 is shown in a “puck” form factor but could be in any of numerous physical form factors, including, for example without limitation, handheld form factors, desktop form factors, etc. Each of the system 100 and the system 200 includes various buttons and/or sensors, such as touch / capacitive sensors, configured to control various functions like volume up/down, mute, answer, hang-up, camera on/off, background effect(s) on/off, etc. In various examples, the buttons may be dedicated to their respective configured functionality. In other examples, the buttons may be assignable to various functions, which may be user-assigned in various examples.
Each of the system 100 and the system 200 includes a multifunction button 110 that may be coupled to a processor and configured to control or change numerous features and/or functions of the video conferencing station. In certain examples, the system 100, 200 is configured to respond to user activation of the button 110 by simultaneously (or nearly simultaneously) altering the operation of multiple functions of the video conferencing station at once, e.g., when a user needs to turn off audio and video in a hurry, such as may be desirable if something happens in the background that other participants should not see or hear. In various examples, the system 100, 200 is configured to respond to the button 110 by muting a microphone, turning off a camera, and posting a message, such as “be right back,” into a chat feature and/or as a background or video feed to the video conferencing system. In some examples, the message may be a status indicator, or may be a response triggered by activity of other participant(s) in the video conference. The message may be user defined or user selectable in various examples. Such functionality may be described as a “bio-break” feature that allows a participant (a user of the video conferencing station) to easily excuse themselves from the video conference for a brief time. A second activation of the button 110 may return each of the camera, microphone, loudspeaker, etc. to its prior state. In some examples, a time delay may be enforced by the system 100, 200 to ensure that an erroneous second activation does not trigger the return to the prior operating state.
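For purposes of illustration only, a minimal sketch of the toggle behavior just described is shown below. The station attributes, the posted message, and the one-second guard interval are assumptions made for the sketch; the disclosure does not prescribe any particular implementation.

    # Illustrative "bio-break" toggle; station attribute names, the message text,
    # and the guard interval are assumptions for this sketch only.
    import time
    from dataclasses import dataclass
    from typing import Optional


    @dataclass
    class StationState:
        mic_muted: bool = False
        camera_on: bool = True
        speaker_volume: float = 1.0


    class BioBreakButton:
        """One button that pauses outgoing audio/video and later restores the prior state."""

        def __init__(self, station: StationState, min_interval_s: float = 1.0) -> None:
            self.station = station
            self.min_interval_s = min_interval_s   # delay guarding against an erroneous second press
            self._saved: Optional[StationState] = None
            self._last_press = float("-inf")

        def press(self) -> None:
            now = time.monotonic()
            if now - self._last_press < self.min_interval_s:
                return                              # too soon; ignore the activation
            self._last_press = now

            if self._saved is None:
                # First activation: remember the prior state, then mute, disable, and quiet everything.
                self._saved = StationState(**vars(self.station))
                self.station.mic_muted = True       # mute the microphone
                self.station.camera_on = False      # turn off the camera
                self.station.speaker_volume = 0.0   # quiet the loudspeaker
                print("posted: Be right back")      # stand-in for a chat/status message
            else:
                # Second activation: return camera, microphone, and loudspeaker to their prior state.
                self.station.mic_muted = self._saved.mic_muted
                self.station.camera_on = self._saved.camera_on
                self.station.speaker_volume = self._saved.speaker_volume
                self._saved = None


    if __name__ == "__main__":
        button = BioBreakButton(StationState())
        button.press()                 # step away
        time.sleep(1.1)
        button.press()                 # return; prior state restored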
In some examples, activation of the button 110 may be configured to provide effortless shortcuts, which may be configurable by a user. In some examples, each of the system 100 and the system 200 may be configured to allow a user to program multiple actions or functions to be executed upon activation of the button 110, akin to executing a macro.
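One way such user programming could be represented, again purely as an illustrative sketch under assumed names, is a list of callable actions assigned to the button and executed in order on activation.

    # Illustrative user-programmable macro; the action names are placeholders only.
    from typing import Callable, List


    class MultifunctionButton:
        """Runs every user-assigned action when the button is activated."""

        def __init__(self) -> None:
            self._actions: List[Callable[[], None]] = []

        def assign(self, action: Callable[[], None]) -> None:
            self._actions.append(action)      # user adds an action to the macro

        def activate(self) -> None:
            for action in self._actions:      # execute the programmed actions in order
                action()


    if __name__ == "__main__":
        button = MultifunctionButton()
        button.assign(lambda: print("microphone muted"))
        button.assign(lambda: print("camera disabled"))
        button.assign(lambda: print("status posted: Be right back"))
        button.activate()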
In various examples, the button 110 is configured to be available for discreet activation, e.g., without needing a user to make large physical movements to do so. For instance, the button 110 may be of a size, location, and labeling to enable quick and easy acquisition (visual sighting) and physical activation by the user, in a manner that minimizes the possibility of other participants noticing the activation in a video feed.
In various examples, the system 100, 200 may be configured to respond to the button 110 by obscuring the background environment of the user, such as by activating a background blurring feature of the video conferencing station or activating a background image in lieu of a true background provided in a video feed from the video camera.
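A brief sketch of how the background response might be selected is shown below; the modes and frame handling are illustrative assumptions and stand in for whatever blurring or image-replacement capability the video conferencing station actually provides.

    # Illustrative background-mode selection; the enum values and the frame stubs
    # are assumptions for this sketch, not any product's actual video pipeline.
    from enum import Enum, auto


    class BackgroundMode(Enum):
        PASSTHROUGH = auto()   # true background from the camera
        BLUR = auto()          # obscure the real background
        REPLACE = auto()       # substitute a stored background image


    def apply_background(frame: bytes, mode: BackgroundMode, replacement: bytes = b"") -> bytes:
        """Return the outgoing video frame after the selected background effect."""
        if mode is BackgroundMode.BLUR:
            return b"<blurred>" + frame        # stand-in for a real blur filter
        if mode is BackgroundMode.REPLACE:
            return replacement                 # stand-in for segmentation and compositing
        return frame


    if __name__ == "__main__":
        # Activating the button could simply switch the mode applied to each outgoing frame.
        print(apply_background(b"camera-frame", BackgroundMode.BLUR))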
The button 110 may be of any suitable physical or virtual type. For example, the button 110 may be a physical button or switch or may be another type of user input sensor, such as a capacitive touch sensor, an optical sensor, a gesture control, etc. In some instances the button 110 may be a virtual button, such as an active location on a screen that may be selected by a user, such as by a mouse, touchscreen, or other computer input device. Accordingly, the button 110 may take the form of an icon on a display screen capable of being selected or activated by a user. Example systems 100, 200 and the like having a button 110 configured to activate or alter numerous functions with a single activation, whether pre-configured or user-definable, save time, mental effort, and potential embarrassment or social consequences by making it easier to locate and execute video conferencing meeting management functions.
Examples of the methods and apparatuses discussed herein are not limited in application to the details of construction and the arrangement of components set forth in the above descriptions or illustrated in the accompanying drawings. The methods and apparatuses are capable of implementation in other examples and of being practiced or of being carried out in various ways. Examples of specific implementations are provided herein for illustrative purposes only and are not intended to be limiting. In particular, functions, components, elements, and features discussed in connection with any one or more examples are not intended to be excluded from a similar role in any other examples.
Also, the phraseology and terminology used herein is for the purpose of description and should not be regarded as limiting. Any references to examples, components, elements, acts, or functions of the systems and methods herein referred to in the singular may also embrace embodiments including a plurality, and any references in plural to any example, component, element, act, or function herein may also embrace examples including only a singularity. Accordingly, references in the singular or plural form are not intended to limit the presently disclosed systems or methods, their components, acts, or elements. The use herein of “including,” “comprising,” “having,” “containing,” “involving,” and variations thereof is meant to encompass the items listed thereafter and equivalents thereof as well as additional items. References to “or” may be construed as inclusive so that any terms described using “or” may indicate any of a single, more than one, and all of the described terms. Any references to front and back, left and right, top and bottom, upper and lower, and vertical and horizontal are intended for convenience of description, not to limit the present systems and methods or their components to any one positional or spatial orientation, unless the context reasonably implies otherwise.
Having described above several aspects of at least one example, it is to be appreciated various alterations, modifications, and improvements will readily occur to those skilled in the art. Such alterations, modifications, and improvements are intended to be part of this disclosure and are intended to be within the scope of the invention. Accordingly, the foregoing description and drawings are by way of example only, and the scope of the invention should be determined from proper construction of the appended claims, and their equivalents.

Claims

CLAIMS
What is claimed is:
1. A method of controlling a video conferencing station, the method comprising:
receiving an indication of an activation of a user input;
altering an outgoing audio associated with the video conferencing station in response to the user input; and
altering an outgoing video associated with the video conferencing station in response to the user input.
2. The method of claim 1 further comprising reducing an output volume of a loudspeaker associated with the video conferencing station in response to the user input.
3. The method of claim 1 further comprising posting a message via the video conferencing station in response to the user input.
4. The method of claim 1 further comprising:
reducing an output volume of a loudspeaker associated with the video conferencing station in response to the user input; and
posting a message via the video conferencing station in response to the user input.
5. The method of claim 1 further comprising executing a user-defined function in response to the user input.
6. The method of claim 1 wherein altering the outgoing video comprises disabling the camera.
7. The method of claim 1 wherein altering the outgoing video comprises at least one of blurring a background, replacing the background, or replacing an entire view of the outgoing video.
8. The method of claim 1 wherein altering the outgoing audio comprises muting a microphone.
9. An apparatus comprising:
an enclosure;
a processor within the enclosure; and
a user input sensor coupled to the processor,
wherein the processor is configured to detect activation of the user input sensor and to control a video conferencing station to alter an outgoing audio and alter an outgoing video in response to the detected activation of the user input sensor.
10. The apparatus of claim 9 wherein the processor is further configured to control the video conferencing station to reduce an output volume of a loudspeaker.
11. The apparatus of claim 9 wherein the processor is further configured to control the video conferencing station to post a message viewable by remote participants.
12. The apparatus of claim 9 wherein the processor is further configured to control the video conferencing station to reduce an output volume of a loudspeaker and to post a message viewable by remote participants.
13. The apparatus of claim 9 wherein the processor is further configured to execute a user-defined function.
14. The apparatus of claim 9 wherein altering the outgoing video comprises disabling the camera.
15. The apparatus of claim 9 wherein altering the outgoing video comprises at least one of blurring a background, replacing the background, or replacing an entire view of the outgoing video.
16. The apparatus of claim 9 wherein altering the outgoing audio comprises muting a microphone.
17. A non-transitory machine readable storage medium having instructions encoded thereon that, when executed by a processor, cause the processor to perform a method comprising:
receiving an indication of an activation of a user input;
altering an outgoing audio associated with the video conferencing station in response to the user input; and
altering an outgoing video associated with the video conferencing station in response to the user input.
18. The non-transitory storage medium of claim 17 further comprising instructions to cause the processor to reduce an output volume of a loudspeaker associated with the video conferencing station in response to the user input.
19. The non-transitory storage medium of claim 17 further comprising instructions to cause the processor to post a message via the video conferencing station in response to the user input.
20. The non-transitory storage medium of claim 17 wherein the method further comprises:
reducing an output volume of a loudspeaker associated with the video conferencing station in response to the user input; and
posting a message via the video conferencing station in response to the user input.
21. The non-transitory storage medium of claim 17 further comprising instructions to cause the processor to execute a user-defined function in response to the user input.
22. The non-transitory storage medium of claim 17 wherein altering the outgoing video comprises disabling the camera.
23. The non-transitory storage medium of claim 17 wherein altering the outgoing video comprises at least one of blurring a background, replacing the background, or replacing an entire view of the outgoing video.
24. The non-transitory storage medium of claim 17 wherein altering the outgoing audio comprises muting a microphone.
PCT/US2021/052632 (priority date 2020-10-01, filed 2021-09-29): Video teleconferencing, published as WO2022072482A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US202063086354P 2020-10-01 2020-10-01
US63/086,354 2020-10-01

Publications (1)

Publication Number | Publication Date
WO2022072482A1 | 2022-04-07

Family

ID=78592908

Family Applications (1)

Application Number | Title | Priority Date | Filing Date
PCT/US2021/052632 (WO2022072482A1, en) | Video teleconferencing | 2020-10-01 | 2021-09-29

Country Status (1)

Country Link
WO (1) WO2022072482A1 (en)

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number | Priority date | Publication date | Assignee | Title
US20120154509A1 * | 2010-12-16 | 2012-06-21 | Mitel Networks Corporation | Method and system for audio-video communications
US20130002800A1 * | 2011-06-28 | 2013-01-03 | Mock Wayne E | Muting a Videoconference Using Touch-Based Gestures
EP2797279B1 * | 2013-04-23 | 2016-09-14 | Gurulogic Microsystems OY | Communications using at least two different media types
US20170374118A1 * | 2016-06-23 | 2017-12-28 | Ringcentral, Inc. | Conferencing system and method implementing video quasi-muting
US20200028883A1 * | 2018-07-18 | 2020-01-23 | International Business Machines Corporation | Enhanced teleconferencing using noise filtering, amplification, and selective muting

Similar Documents

Publication Publication Date Title
US11586340B2 (en) Terminal and method for setting menu environments in the terminal
CN110944083B (en) Method, system, and medium for do-not-disturb mode
WO2016206201A1 (en) Mobile terminal, and display control method and apparatus
US8090087B2 (en) Method, system, and graphical user interface for making conference calls
US9363476B2 (en) Configuration of a touch screen display with conferencing
CN106371688A (en) Full-screen single-hand operation method and apparatus
US9953101B1 (en) Customized home screens for electronic devices
KR20130115174A (en) Apparatus and method for providing a digital bezel
JP6134870B2 (en) Description Information Display Method, Description Information Display Device, Electronic Device, Program, and Recording Medium
JP2017527928A (en) Text input method, apparatus, program, and recording medium
JP2017511947A (en) Button interaction method, apparatus, program, and recording medium
US20200285381A1 (en) User interface for use in computing device with sensitive display
EP3430802A1 (en) Selectable interaction elements in a 360-degree video stream
US20180285051A1 (en) System for controlling a display device
CN110187947A (en) A kind of message display method and terminal device
CN111610912B (en) Application display method, application display device and storage medium
CN106095285B (en) Operation execution method and device
US20230319413A1 (en) User interfaces for camera sharing
CN112764710A (en) Audio playing mode switching method and device, electronic equipment and storage medium
US9532198B2 (en) System and method for initiating communication from actual, notional, or dissociated previewed images
EP2898492B1 (en) Handheld information processing device with remote control output mode
WO2022072482A1 (en) Video teleconferencing
WO2023115777A1 (en) Cursor control method and apparatus, electronic device, and storage medium
EP4270938A1 (en) Video call method and device
CN112087643B (en) Information processing method and device

Legal Events

Date Code Title Description
121 EP: The EPO has been informed by WIPO that EP was designated in this application (Ref document number: 21806060; Country of ref document: EP; Kind code of ref document: A1)
NENP Non-entry into the national phase (Ref country code: DE)
122 EP: PCT application non-entry in European phase (Ref document number: 21806060; Country of ref document: EP; Kind code of ref document: A1)