US11910130B2 - Media control device and system - Google Patents


Info

Publication number
US11910130B2
US11910130B2 (application US17/683,516 / US202217683516A)
Authority
US
United States
Prior art keywords
video conferencing
conferencing application
media control
computing device
input
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active, expires
Application number
US17/683,516
Other versions
US20220286646A1 (en)
Inventor
Jinmo Rhee
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Carnegie Mellon University
Original Assignee
Carnegie Mellon University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Carnegie Mellon University
Priority to US17/683,516
Assigned to Carnegie Mellon University (assignor: Rhee, Jinmo)
Publication of US20220286646A1
Application granted
Publication of US11910130B2
Legal status: Active
Adjusted expiration

Classifications

    • H04N 7/141, 7/142: Systems for two-way working between two video terminals (e.g., videophone); constructional details of the terminal equipment (e.g., arrangements of the camera and the display)
    • H04N 7/15: Conference systems
    • H04N 21/42204, 21/42206, 21/4221: User interfaces for controlling a client device through a remote control device, characterized by hardware details; dedicated function buttons (e.g., for the control of an EPG, subtitles, aspect ratio, picture-in-picture or teletext)
    • H04N 21/4788: Supplemental services communicating with other users (e.g., chatting)

Definitions

  • the term “communication” may refer to the reception, receipt, transmission, transfer, provision, and/or the like of data (e.g., information, signals, messages, instructions, commands, and/or the like). For one unit (e.g., a device, a system, a component of a device or system, combinations thereof, and/or the like) to be in communication with another unit, this may refer to a direct or indirect connection (e.g., a direct communication connection, an indirect communication connection, and/or the like) that is wired and/or wireless in nature. Two units may be in communication with each other even though the information transmitted may be modified, processed, relayed, and/or routed between the first and second unit. A first unit may be in communication with a second unit even though the first unit passively receives information and does not actively transmit information to the second unit. A first unit may also be in communication with a second unit if at least one intermediary unit processes information received from the first unit and communicates the processed information to the second unit.
  • the term “computing device” may refer to one or more electronic devices configured to process data. A computing device may, in some examples, include the necessary components to receive, process, and output data, such as a display, a processor, a memory, an input device, and a network interface. A computing device may be a mobile device, such as a cellular phone (e.g., a smartphone or standard cellular phone), a portable computer, a wearable device (e.g., watches, glasses, lenses, clothing, and/or the like), a personal digital assistant (PDA), and/or other like devices. The computing device may also be a desktop computer or other form of non-mobile computer.
  • the media control system 1000 includes a media control device 100 in communication with a separate computing device 105 .
  • the media control device 100 includes a housing 101 having a user interface 102 .
  • the user interface 102 includes several buttons 104 . It will be appreciated that the user interface 102 may include one or more touch screens, switches, dials, and/or any other type of input device, in addition to or alternatively to the buttons 104 .
  • the buttons 104 represent different functions for a video conferencing application executed by the separate computing device 105 . For example, each button 104 may be configured to cause a different function on a video conferencing application.
  • video conferencing application refers to any software application that provides for audio and video communication between two or more computing devices, such as but not limited to video conferencing platforms including Zoom®, Skype®, Teams®, BlueJeans®, and/or the like.
  • the separate computing device 105 may be any type of computing device including a processor, such as a desktop computer, laptop computer, tablet computer, smartphone, server computer, and/or the like.
  • the computing device 105 may be in communication with the media control device 100 via a wired or wireless connection, such as a USB cable, a WiFi® connection, a Bluetooth® connection, and/or the like.
  • the computing device 105 may include or be in communication with a data storage device 107 including applications that execute on the computing device 105 .
  • the data storage device 107 may also store one or more device drivers, such as a device driver for facilitating communication between the computing device 105 and the media control device 100 .
  • a device driver may be executed by an operating system of the computing device 105 such that signals from the media control device 100 are interpreted by the computing device 105 and/or applications being executed on the computing device 105 .
  • the data storage device 107 and/or another data storage device in local or remote communication with the computing device 105 may store input configurations for each of a plurality of different video conferencing applications. For example, keyboard and/or mouse commands may be mapped to functions within a video conferencing application. The input configurations may be stored with respect to each video conferencing application and/or may be separately stored as aggregated data in a database.
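The per-application input configurations described above might be organized as a simple lookup structure. The sketch below is an illustrative assumption, not taken from the patent: the application names and shortcut strings are placeholders, and a real driver would likely load these from the data storage device or a database.

```python
# Hypothetical per-application input configurations: each video conferencing
# application maps device functions to the keyboard shortcut it recognizes.
# Application names and shortcut strings are illustrative assumptions.
INPUT_CONFIGURATIONS = {
    "AppA": {
        "toggle_mute": "ALT+A",
        "screen_share": "ALT+S",
        "toggle_video": "ALT+V",
    },
    "AppB": {
        "toggle_mute": "CTRL+M",
        "screen_share": "CTRL+SHIFT+S",
        "toggle_video": "CTRL+SHIFT+K",
    },
}

def lookup_shortcut(application: str, function: str):
    """Return the keystroke for a device function, if the application defines one."""
    return INPUT_CONFIGURATIONS.get(application, {}).get(function)
```

Storing the configurations per application (rather than hard-coding one set) is what lets the same physical buttons drive different platforms.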
  • a controller 106 is arranged inside the housing 101 .
  • the controller 106 may be any device configured to receive inputs from the user interface 102 and communicate signals based on those inputs to the separate computing device 105 .
  • the controller 106 may be an encoder configured to encode signals from the user interface 102 .
  • the controller 106 may include a circuit board including an integrated circuit (IC) programmed to encode user inputs from the buttons 104 to digital signals.
  • the controller 106 may include, for example, a USB encoder chip, a network controller chip, and/or the like.
  • the controller 106 may be a computing device including a processor, such as a Central Processing Unit (CPU) or a microprocessor.
  • the controller 106 generates output signals that are output through a wired or wireless connection.
  • the example shown in FIG. 2 includes a cable 108 , such as a USB cable, for outputting signals.
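The encoder role of the controller 106 can be sketched as follows. This is a minimal simulation under assumed framing (one start byte plus a button code); the patent does not specify a wire format, and a real device would emit the frame over USB or a wireless link rather than return it from a function.

```python
# Minimal simulation of the controller/encoder behavior: each button press on
# the user interface is encoded as a small digital frame that the device sends
# to the separate computing device. The framing is an illustrative assumption.
START_BYTE = 0xAA  # assumed frame delimiter

def encode_button_press(button_index: int) -> bytes:
    """Encode a button press as a two-byte signal frame (device side)."""
    if not 0 <= button_index <= 0xFF:
        raise ValueError("button index out of range")
    return bytes([START_BYTE, button_index])

def decode_signal(frame: bytes) -> int:
    """Decode a received frame back into a button index (driver side)."""
    if len(frame) != 2 or frame[0] != START_BYTE:
        raise ValueError("malformed frame")
    return frame[1]
```

A USB encoder chip, as mentioned above, would instead present the device as a standard HID peripheral, but the encode/decode split is the same idea.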
  • the media control device 100 includes buttons 104 or other means of user input that can be customized and mapped to different functions in a video conferencing application. In this way, users are provided with instant access to the features of a video conferencing application without the physical movement otherwise required.
  • a user may wish to execute several different functions of a video conferencing application, such as switching to a full screen mode, switching to a grid view, sharing one or more screens, muting a microphone, muting a speaker, turning video on or off, and/or the like.
  • These features can be activated swiftly and without undesired movements, allowing users such as teachers and other presenters to seamlessly present and interact with an audience.
  • a media control device 100 may have preset buttons and customizable buttons. In other examples, all of the buttons or inputs may be customizable.
  • a user may customize the buttons 104 through a separate computing device, such as through a software application on the computing device 105 that adapts settings for a device driver for the media control device.
  • a user may also customize the buttons through the media control device 100 itself or through a separate device (e.g., a smartphone) in communication with the media control device 100 .
  • the media control device 100 may include memory configured to store the settings in a manner that is accessible to the controller 106 .
  • the media control device 100 may issue commands to the video conferencing application in several different ways.
  • the user inputs may be encoded as keyboard inputs, such as one or more key presses, by the media control device 100 or by a device driver executing on the computing device 105 .
  • the processing of the user inputs may be performed by the media control device 100 , such as controller 106 , the separate computing device 105 , and/or a combination thereof.
  • the user inputs may be mapped to communicate via an Application Programming Interface (API) associated with the video conferencing application.
  • the user inputs may be mapped to a sequence of other inputs, such as a combination of keystrokes, mouse movements, and/or the like.
  • the configuration of the user inputs may be modified based on the video conferencing application being used.
  • a device driver, the media control device 100 , and/or an application on the separate computing device 105 may determine preset or customized shortcut configurations for a particular video conferencing application.
  • the corresponding configurations may be used to map the signals from the media control device 100 to signals (e.g., keystrokes) that are recognized by the relevant video conferencing platform.
  • referring to FIGS. 3A-3F, shown are views of a media control device 100 according to non-limiting embodiments.
  • FIG. 3 A shows a front perspective view.
  • FIG. 3 B shows a back perspective view.
  • FIG. 3 C shows a front view.
  • FIG. 3 D shows a back view.
  • FIG. 3 E shows a left side view.
  • FIG. 3 F shows a right side view.
  • the media control device may include any number of buttons or other input devices and may be shaped in various ways. Different types of users may require more buttons than others. In some examples, buttons may be mapped to other software functions that users wish to activate during a presentation, such as actions in PowerPoint® or Windows®.
  • a flow diagram for a method for controlling a video conferencing application is shown according to non-limiting embodiments.
  • the method shown in FIG. 5 is for example purposes only. It will be appreciated that non-limiting embodiments of the method may include fewer, additional, different, and/or a different order of steps than shown in FIG. 5 .
  • at step 502, a video conferencing application is detected. Detecting a video conferencing application may include, for example, monitoring active applications being executed on a computing device. In some examples, detecting a video conferencing application may include checking whether any application from a predetermined list is being executed on a computing device.
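The detection step just described can be sketched as a comparison of active process names against a predetermined list. Enumerating live processes is OS-specific, so this assumed sketch takes the list of active process names as input; the application names are placeholders, not an authoritative list.

```python
# Sketch of the detection step: check whether any application from a
# predetermined list is currently executing. The set of known application
# process names below is an illustrative assumption.
KNOWN_VIDEO_CONFERENCING_APPS = {"zoom", "skype", "teams", "bluejeans"}

def detect_video_conferencing_app(active_processes):
    """Return the first known video conferencing application found, if any."""
    for name in active_processes:
        if name.lower() in KNOWN_VIDEO_CONFERENCING_APPS:
            return name
    return None
```

A device driver might run such a check periodically, or subscribe to OS notifications of application launches, so that the input configuration can be switched as soon as a conference starts.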
  • if a video conferencing application is detected at step 502, the method proceeds to step 504 and an input configuration is determined based on the detected video conferencing application. For example, if a first video conferencing application is detected, input configurations for the first video conferencing application may be retrieved from the video conferencing application itself, from a data file associated with the video conferencing application, and/or from an aggregated set of input configurations for various different video conferencing applications.
  • the user may operate the media control device by providing user input (e.g., selecting buttons and/or the like) and the computing device may receive signals from user inputs at step 506 . The signals may be associated with different buttons or other selectable options on the media control device.
  • the signals received from the media control device are mapped to application input signals. For example, based on the input configuration determined at step 504 , a particular signal may be mapped to a keystroke or other action used for controlling the video conferencing application. As an example, if a first video conferencing application has an input configuration in which a “screen share” functionality is activated by pressing “SHIFT+S”, a button for “screen share” on the media control device may produce a signal that is mapped to “SHIFT+S”.
  • a button for “mute” on the media control device may produce a signal that is mapped to “F10”.
  • application input signals may include keystrokes, mouse movements, macros, operating system-level commands, and/or the like.
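The mapping step above can be sketched as a two-stage lookup: a raw device signal identifies a function, and the input configuration determined for the detected application translates that function into an application input signal. The "screen share" to "SHIFT+S" and "mute" to "F10" pairs come from the examples above; the button-to-function layout is an illustrative assumption.

```python
# Sketch of the mapping step: a button signal from the media control device is
# resolved to a device function, then to the keystroke the detected
# application recognizes. The button layout is an assumed example; the
# shortcuts mirror the "SHIFT+S" / "F10" examples in the text.
BUTTON_FUNCTIONS = {0: "screen_share", 1: "mute"}
INPUT_CONFIGURATION = {"screen_share": "SHIFT+S", "mute": "F10"}

def map_signal(button_index: int) -> str:
    """Map a device signal to the application input signal (a keystroke)."""
    function = BUTTON_FUNCTIONS[button_index]
    return INPUT_CONFIGURATION[function]
```

Keeping the button-to-function table separate from the function-to-keystroke table is what allows the same physical button to mean "screen share" across applications whose shortcuts differ.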
  • the video conferencing application may be controlled based on the application input signal.
  • the computing device (through the operating system or an application executing thereon) may produce an emulated keyboard signal by sending an operating system-level command for “SHIFT+S”, “F10”, a mouse action, and/or the like.
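One way to picture the emulation step is expanding a mapped chord such as "SHIFT+S" into the ordered key-down/key-up events that an OS-level input-injection API would consume. The actual injection call (e.g., SendInput on Windows) is platform-specific and omitted here; this is a sketch of the event ordering only.

```python
# Sketch of keystroke emulation: expand a '+'-separated chord into the
# (key, action) events an OS-level input-injection API would consume.
# Modifiers are pressed first and released last, mirroring how a person
# would physically type the chord.
def chord_to_events(chord: str):
    """Expand a chord like 'SHIFT+S' into ordered (key, action) events."""
    keys = chord.split("+")
    events = [(key, "down") for key in keys]           # press in order
    events += [(key, "up") for key in reversed(keys)]  # release in reverse
    return events
```

Emitting paired down/up events, rather than a single "press", matters because video conferencing applications register shortcuts from the same low-level events a physical keyboard produces.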
  • referring to FIG. 6, device 900 may include a bus 902 , a processor 904 , memory 906 , a storage component 908 , an input component 910 , an output component 912 , and a communication interface 914 . In some non-limiting embodiments, device 900 may include additional components, fewer components, different components, or differently arranged components than those shown in FIG. 6.
  • Bus 902 may include a component that permits communication among the components of device 900 .
  • processor 904 may be implemented in hardware, firmware, or a combination of hardware and software.
  • processor 904 may include a processor (e.g., a central processing unit (CPU), a graphics processing unit (GPU), an accelerated processing unit (APU), etc.), a microprocessor, a digital signal processor (DSP), and/or any processing component (e.g., a field-programmable gate array (FPGA), an application-specific integrated circuit (ASIC), etc.) that can be programmed to perform a function.
  • Memory 906 may include random access memory (RAM), read only memory (ROM), and/or another type of dynamic or static storage device (e.g., flash memory, magnetic memory, optical memory, etc.) that stores information and/or instructions for use by processor 904 .
  • storage component 908 may store information and/or software related to the operation and use of device 900 .
  • storage component 908 may include a hard disk (e.g., a magnetic disk, an optical disk, a magneto-optic disk, a solid state disk, etc.) and/or another type of computer-readable medium.
  • Input component 910 may include a component that permits device 900 to receive information, such as via user input (e.g., a touch screen display, a keyboard, a keypad, a mouse, a button, a switch, a microphone, etc.).
  • input component 910 may include a sensor for sensing information (e.g., a global positioning system (GPS) component, an accelerometer, a gyroscope, an actuator, etc.).
  • Output component 912 may include a component that provides output information from device 900 (e.g., a display, a speaker, one or more light-emitting diodes (LEDs), etc.).
  • Communication interface 914 may include a transceiver-like component (e.g., a transceiver, a separate receiver and transmitter, etc.) that enables device 900 to communicate with other devices, such as via a wired connection, a wireless connection, or a combination of wired and wireless connections.
  • Communication interface 914 may permit device 900 to receive information from another device and/or provide information to another device.
  • communication interface 914 may include an Ethernet interface, an optical interface, a coaxial interface, an infrared interface, a radio frequency (RF) interface, a universal serial bus (USB) interface, a Wi-Fi® interface, a cellular network interface, and/or the like.
  • Device 900 may perform one or more processes described herein. Device 900 may perform these processes based on processor 904 executing software instructions stored by a computer-readable medium, such as memory 906 and/or storage component 908 .
  • a computer-readable medium may include any non-transitory memory device.
  • a memory device includes memory space located inside of a single physical storage device or memory space spread across multiple physical storage devices.
  • Software instructions may be read into memory 906 and/or storage component 908 from another computer-readable medium or from another device via communication interface 914 . When executed, software instructions stored in memory 906 and/or storage component 908 may cause processor 904 to perform one or more processes described herein.
  • hardwired circuitry may be used in place of or in combination with software instructions to perform one or more processes described herein. Thus, embodiments described herein are not limited to any specific combination of hardware circuitry and software.
  • the term “programmed or configured,” as used herein, refers to an arrangement of software, hardware circuitry, or any combination thereof on one or more devices.

Abstract

Provided is a media control device system including a media control device including: a user interface configured to receive user input; and a communication interface configured to establish communication with a separate computing device. The system also includes a computer-readable non-transitory medium including program instructions that, when executed by a processor of the separate computing device, cause the separate computing device to detect a video conferencing application being executed by the separate computing device, in response to detecting the video conferencing application, determine an input configuration of the video conferencing application, receive a plurality of signals from the media control device via the communication interface, the plurality of signals representing a plurality of different user inputs, map the plurality of signals to a plurality of video conferencing application input signals based on the input configuration, and control the video conferencing application based on the video conferencing application input signals.

Description

CROSS REFERENCE TO RELATED APPLICATION
The present application claims the benefit of U.S. Provisional Patent Application No. 63/155,403, filed on Mar. 2, 2021, the entire disclosure of which is incorporated herein by reference.
BACKGROUND 1. Field
This disclosure relates generally to a media control device and media control system.
2. Technical Considerations
Video conferencing applications, such as Zoom®, Skype®, Teams®, BlueJeans®, and/or the like, are used by individuals to conduct virtual meetings and are controlled through software settings specific to each application or platform. For example, a user that is speaking or otherwise presenting via one of these applications may have to locate their mouse, search through options, and/or otherwise provide inputs that disrupt the flow of the presentation and distract the presenter and audience.
SUMMARY
According to non-limiting embodiments or aspects, provided is a media control device system comprising: (a) a media control device comprising: a user interface configured to receive user input; and a communication interface configured to establish communication with a separate computing device; and (b) a computer-readable non-transitory medium including program instructions that, when executed by a processor of the separate computing device, cause the separate computing device to: detect a video conferencing application being executed by the separate computing device; in response to detecting the video conferencing application, determine an input configuration of the video conferencing application; receive a plurality of signals from the media control device via the communication interface, the plurality of signals representing a plurality of different user inputs; map the plurality of signals to a plurality of video conferencing application input signals based on the input configuration; and control the video conferencing application based on the video conferencing application input signals.
In non-limiting embodiments or aspects, the media control device further comprises a housing, and the user interface comprises a plurality of buttons arranged on the housing. In non-limiting embodiments or aspects, each button of the plurality of buttons is associated with a different function of the video conferencing application. In non-limiting embodiments or aspects, the program instructions are part of a device driver executing on the separate computing device. In non-limiting embodiments or aspects, the video conferencing application is controlled based on the at least one video conferencing application input signal by emulating, by the processor, the at least one video conferencing application input signal. In non-limiting embodiments or aspects, emulating the at least one video conferencing application input signal comprises generating at least one operating system-level command.
According to non-limiting embodiments or aspects, provided is a media control device comprising: a housing; a user interface configured to receive user input arranged on the housing; a communication interface configured to establish communication with a separate computing device; and a controller arranged in the housing and in communication with the user interface and the communication interface, the controller configured to: generate signals based on the user input, and control a video conferencing application executing on the separate computing device with the signals.
In non-limiting embodiments or aspects, the user interface comprises a plurality of buttons. In non-limiting embodiments or aspects, each button of the plurality of buttons is associated with a different function of the video conferencing application. In non-limiting embodiments or aspects, the controller generates the signals by converting the user input to keyboard inputs recognized by the video conferencing application.
According to non-limiting embodiments or aspects, provided is a media control method comprising: detecting, with a computing device, a video conferencing application being executed by the computing device; in response to detecting the video conferencing application, determining an input configuration of the video conferencing application; receiving at least one signal from a media control device via a communication interface of the media control device, the at least one signal representing a user input on the media control device of a plurality of possible user inputs, wherein the media control device is separate from the computing device; mapping the at least one signal to at least one video conferencing application input signal based on the input configuration; and controlling the video conferencing application based on the at least one video conferencing application input signal.
In non-limiting embodiments or aspects, the media control device comprises a housing and a user interface, the user interface comprising a plurality of buttons arranged on the housing. In non-limiting embodiments or aspects, each button of the plurality of buttons produces a different signal when actuated, the method further including mapping each different signal to a different video conferencing application input signal based on the input configuration. In non-limiting embodiments or aspects, each button of the plurality of buttons is associated with a different function of the video conferencing application. In non-limiting embodiments or aspects, a device driver installed on the computing device maps the at least one signal to the at least one video conferencing application input signal and controls the video conferencing application. In non-limiting embodiments or aspects, controlling the video conferencing application based on the at least one video conferencing application input signal comprises emulating, with the computing device, the at least one video conferencing application input signal. In non-limiting embodiments or aspects, emulating the at least one video conferencing application input signal comprises generating at least one operating system-level command.
These and other features and characteristics of the present disclosure, as well as the methods of operation and functions of the related elements of structures and the combination of parts and economies of manufacture, will become more apparent upon consideration of the following description and the appended claims with reference to the accompanying drawings, all of which form a part of this specification, wherein like reference numerals designate corresponding parts in the various figures. It is to be expressly understood, however, that the drawings are for the purpose of illustration and description only and are not intended as a definition of the limits of the invention. As used in the specification and the claims, the singular form of “a,” “an,” and “the” include plural referents unless the context clearly dictates otherwise.
BRIEF DESCRIPTION OF THE DRAWINGS
Additional advantages and details are explained in greater detail below with reference to the exemplary embodiments that are illustrated in the accompanying schematic figures, in which:
FIG. 1 is a schematic diagram of a media control system according to a non-limiting embodiment;
FIG. 2 is an exploded view of a media control device according to a non-limiting embodiment;
FIGS. 3A-3F show a media control device according to non-limiting embodiments;
FIG. 4 shows media control devices according to non-limiting embodiments;
FIG. 5 shows a flow diagram for a method for controlling a video conferencing application according to non-limiting embodiments; and
FIG. 6 illustrates example components of a computing device used in connection with non-limiting embodiments.
DESCRIPTION
For purposes of the description hereinafter, the terms “end,” “upper,” “lower,” “right,” “left,” “vertical,” “horizontal,” “top,” “bottom,” “lateral,” “longitudinal,” and derivatives thereof shall relate to the embodiments as they are oriented in the drawing figures. However, it is to be understood that the embodiments may assume various alternative variations and step sequences, except where expressly specified to the contrary. It is also to be understood that the specific devices and processes illustrated in the attached drawings, and described in the following specification, are simply exemplary embodiments or aspects of the invention. Hence, specific dimensions and other physical characteristics related to the embodiments or aspects disclosed herein are not to be considered as limiting. All numbers used in the specification and claims are to be understood as being modified in all instances by the term “about.” The terms “approximately,” “about,” and “substantially” mean a range of plus or minus ten percent of the stated value.
As used herein, the term “communication” may refer to the reception, receipt, transmission, transfer, provision, and/or the like of data (e.g., information, signals, messages, instructions, commands, and/or the like). For one unit (e.g., a device, a system, a component of a device or system, combinations thereof, and/or the like) to be in communication with another unit means that the one unit is able to directly or indirectly receive information from and/or transmit information to the other unit. This may refer to a direct or indirect connection (e.g., a direct communication connection, an indirect communication connection, and/or the like) that is wired and/or wireless in nature. Additionally, two units may be in communication with each other even though the information transmitted may be modified, processed, relayed, and/or routed between the first and second unit. For example, a first unit may be in communication with a second unit even though the first unit passively receives information and does not actively transmit information to the second unit. As another example, a first unit may be in communication with a second unit if at least one intermediary unit processes information received from the first unit and communicates the processed information to the second unit.
As used herein, the term “computing device” may refer to one or more electronic devices configured to process data. A computing device may, in some examples, include the necessary components to receive, process, and output data, such as a display, a processor, a memory, an input device, and a network interface. A computing device may be a mobile device. As an example, a mobile device may include a cellular phone (e.g., a smartphone or standard cellular phone), a portable computer, a wearable device (e.g., watches, glasses, lenses, clothing, and/or the like), a personal digital assistant (PDA), and/or other like devices. The computing device may also be a desktop computer or other form of non-mobile computer.
Referring to FIG. 1 , a media control system 1000 is shown according to a non-limiting embodiment. The media control system 1000 includes a media control device 100 in communication with a separate computing device 105. The media control device 100 includes a housing 101 having a user interface 102. The user interface 102 includes several buttons 104. It will be appreciated that the user interface 102 may include one or more touch screens, switches, dials, and/or any other type of input device, in addition to or alternatively to the buttons 104. The buttons 104 represent different functions for a video conferencing application executed by the separate computing device 105. For example, each button 104 may be configured to cause a different function on a video conferencing application.
As used herein, the term “video conferencing application” refers to any software application that provides for audio and video communication between two or more computing devices, such as but not limited to video conferencing platforms including Zoom®, Skype®, Teams®, BlueJeans®, and/or the like.
Still referring to FIG. 1 , the separate computing device 105 may be any type of computing device including a processor, such as a desktop computer, laptop computer, tablet computer, smartphone, server computer, and/or the like. The computing device 105 may be in communication with the media control device 100 via a wired or wireless connection, such as a USB cable, a WiFi® connection, a Bluetooth® connection, and/or the like. The computing device 105 may include or be in communication with a data storage device 107 including applications that execute on the computing device 105. The data storage device 107 may also store one or more device drivers, such as a device driver for facilitating communication between the computing device 105 and the media control device 100. As an example, a device driver may be executed by an operating system of the computing device 105 such that signals from the media control device 100 are interpreted by the computing device 105 and/or applications being executed on the computing device 105. In non-limiting examples, the data storage device 107 and/or another data storage device in local or remote communication with the computing device 105 may store input configurations for each of a plurality of different video conferencing applications. For example, keyboard and/or mouse commands may be mapped to functions within a video conferencing application. The input configurations may be stored with respect to each video conferencing application and/or may be separately stored as aggregated data in a database.
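The per-application input configurations described above can be pictured as a table keyed by application, with each entry mapping a device function to the shortcut that application recognizes. The sketch below is illustrative only; the application keys and shortcut strings are assumptions, not values taken from this disclosure:

```python
from typing import Dict, Optional

# Hypothetical per-application input configurations: each video
# conferencing application maps a device function name to the keyboard
# shortcut that application recognizes. All application names and
# shortcut strings here are illustrative assumptions.
INPUT_CONFIGURATIONS: Dict[str, Dict[str, str]] = {
    "zoom": {"mute": "ALT+A", "screen_share": "ALT+S", "full_screen": "ALT+F"},
    "skype": {"mute": "CTRL+M", "screen_share": "CTRL+SHIFT+K"},
}

def lookup_shortcut(application: str, function: str) -> Optional[str]:
    """Return the stored shortcut for a function in the given application, if any."""
    return INPUT_CONFIGURATIONS.get(application, {}).get(function)
```

Stored this way, the same device function (e.g., "mute") resolves to a different application input signal depending on which video conferencing application is detected.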
Referring now to FIG. 2 , an exploded view of a media control device 100 is shown according to a non-limiting embodiment. A controller 106 is arranged inside the housing 101. The controller 106 may be any device configured to receive inputs from the user interface 102 and communicate signals based on those inputs to the separate computing device 105. For example, the controller 106 may be an encoder configured to encode signals from the user interface 102. In some examples, the controller 106 may include a circuit board including an integrated circuit (IC) programmed to encode user inputs from the buttons 104 to digital signals. The controller 106 may include, for example, a USB encoder chip, a network controller chip, and/or the like. In some examples, the controller 106 may be a computing device including a processor, such as a Central Processing Unit (CPU) or a microprocessor. The controller 106 generates output signals that are output through a wired or wireless connection. The example shown in FIG. 2 includes a cable 108, such as a USB cable, for outputting signals.
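The encoding role of the controller 106 can be sketched in software as assigning each physical button a distinct code that is transmitted to the separate computing device over the wired or wireless connection. The byte values below are illustrative assumptions, not a specification of any actual encoding:

```python
# Minimal sketch of the encoder role of controller 106: each button
# index is assigned a distinct code, so the driver on the separate
# computing device can tell the buttons apart. The code values are
# illustrative assumptions.
BUTTON_CODES = {0: 0x01, 1: 0x02, 2: 0x03, 3: 0x04}

def encode_button_press(button_index: int) -> bytes:
    """Encode a button press as a one-byte signal for transmission."""
    try:
        return bytes([BUTTON_CODES[button_index]])
    except KeyError:
        raise ValueError(f"unknown button index: {button_index}")
```

In a real device this encoding would typically be performed by a USB encoder chip or microcontroller firmware rather than by Python code.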
In non-limiting embodiments, the media control device 100 includes buttons 104 or other means of user input that can be customized and mapped to different functions in a video conferencing application. By including several buttons or other user inputs on a media control device 100 separate from another computing device used to execute the video conferencing application, users are provided with instant access to the features of a video conferencing application without unnecessary physical movement. For example, during a video conference a user may wish to execute several different functions of a video conferencing application, such as switching to a full screen mode, switching to a grid view, sharing one or more screens, muting a microphone, muting a speaker, turning video on or off, and/or the like. These features can be activated swiftly and without undesired movements, allowing users such as teachers and other presenters to seamlessly present and interact with an audience.
In some examples, a media control device 100 may have preset buttons and customizable buttons. In other examples, all of the buttons or inputs may be customizable. A user may customize the buttons 104 through a separate computing device, such as through a software application on the computing device 105 that adapts settings for a device driver for the media control device. A user may also customize the buttons through the media control device 100 itself or through a separate device (e.g., such as a smartphone) in communication with the media control device 100. In such examples, the media control device 100 may include memory configured to store the settings in a manner that is accessible to the controller 106.
The media control device 100 may issue commands to the video conferencing application in several different ways. For example, in some non-limiting embodiments, the user inputs may be encoded as keyboard inputs, such as one or more key presses, by the media control device 100 or by a device driver executing on the computing device 105. The processing of the user inputs may be performed by the media control device 100, such as controller 106, the separate computing device 105, and/or a combination thereof. In some non-limiting embodiments, the user inputs may be mapped to communicate via an Application Programming Interface (API) associated with the video conferencing application. In some non-limiting embodiments, the user inputs may be mapped to a sequence of other inputs, such as a combination of keystrokes, mouse movements, and/or the like.
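The alternative command paths described above (keyboard emulation, an API call, or a sequence of inputs) can be sketched as a dispatch step performed by the driver. The mapping entries and the API method name below are illustrative assumptions, not taken from any real platform:

```python
# Sketch of the alternative command paths: a single device signal may
# resolve to an emulated key sequence or to a call on a (purely
# hypothetical) video conferencing API client. All entries are
# illustrative assumptions.
MAPPING = {
    "btn_mute": {"kind": "keys", "value": ["F10"]},
    "btn_share": {"kind": "keys", "value": ["SHIFT", "S"]},
    "btn_video": {"kind": "api", "value": "toggle_video"},
}

def resolve(signal: str):
    """Resolve a device signal to the action the driver should perform."""
    entry = MAPPING[signal]
    if entry["kind"] == "keys":
        return ("emulate_keystrokes", entry["value"])
    return ("call_api", entry["value"])
```

The choice between emulation and an API call is transparent to the user, who presses the same button either way.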
In non-limiting embodiments, the configuration of the user inputs may be modified based on the video conferencing application being used. For example, a device driver, the media control device 100, and/or an application on the separate computing device 105 may determine preset or customized shortcut configurations for a particular video conferencing application. In response to determining what video conferencing application is being executed (e.g., Zoom® or Skype®), the corresponding configurations may be used to map the signals from the media control device 100 to signals (e.g., such as key strokes) that are recognized by the relevant video conferencing platform. Thus, even if shortcuts for “full screen mode” are different keys in different video conferencing applications, a user only needs to press a single button 104 or input on the media control device 100 while using either video conferencing application.
Referring now to FIGS. 3A-3F, shown are views of a media control device 100 according to non-limiting embodiments. FIG. 3A shows a front perspective view. FIG. 3B shows a back perspective view. FIG. 3C shows a front view. FIG. 3D shows a back view. FIG. 3E shows a left side view. FIG. 3F shows a right side view.
Referring now to FIG. 4 , shown are configurations and arrangements of media control devices according to different embodiments. As shown, the media control device may include any number of buttons or other input devices and may be shaped in various ways. Different types of users may require more buttons than others. In some examples, buttons may be mapped to other software functions that users wish to activate during a presentation, such as actions in PowerPoint® or Windows®.
Referring now to FIG. 5 , a flow diagram for a method for controlling a video conferencing application is shown according to non-limiting embodiments. The method shown in FIG. 5 is for example purposes only. It will be appreciated that non-limiting embodiments of the method may include fewer, additional, different, and/or a different order of steps than shown in FIG. 5 . At step 500, a video conferencing application is detected. Detecting a video conferencing application may include, for example, monitoring active applications being executed on a computing device. In some examples, detecting a video conferencing application may include checking whether any application on a predetermined list of applications is being executed on a computing device. At step 502, it is determined whether a video conferencing application has been detected. If no detection has been made, the method returns to step 500 and continues to continually and/or periodically attempt to detect a video conferencing application.
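The detection check against a predetermined list can be sketched as follows. The snapshot of running process names would come from an OS-specific API or a library such as psutil (not shown here), and the process names in the list are illustrative assumptions:

```python
# Sketch of the detection step (steps 500-502 of FIG. 5): given a
# snapshot of running process names, check it against a predetermined
# list of video conferencing applications. The names below are
# illustrative assumptions; actual executable names vary by platform.
KNOWN_APPS = {"zoom.exe", "teams.exe", "skype.exe"}

def detect_video_conferencing_app(running_process_names):
    """Return the first known video conferencing app found, or None."""
    for name in running_process_names:
        if name.lower() in KNOWN_APPS:
            return name.lower()
    return None
```

A driver might call this check periodically (e.g., every few seconds), looping until a non-None result is returned, which corresponds to the loop back to step 500 in FIG. 5.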
With continued reference to FIG. 5 , if a video conferencing application is detected at step 502, the method proceeds to step 504 and an input configuration is determined based on the detected video conferencing application. For example, if a first video conferencing application is detected, input configurations for the first video conferencing application may be retrieved from the video conferencing application itself, from a data file associated with the video conferencing application, and/or from an aggregated set of input configurations for various different video conferencing applications. After the input configuration is determined for the detected video conferencing application, the user may operate the media control device by providing user input (e.g., selecting buttons and/or the like) and the computing device may receive signals from user inputs at step 506. The signals may be associated with different buttons or other selectable options on the media control device.
Still referring to FIG. 5 , at step 508, the signals received from the media control device are mapped to application input signals. For example, based on the input configuration determined at step 504, a particular signal may be mapped to a keystroke or other action used for controlling the video conferencing application. As an example, if a first video conferencing application has an input configuration in which a “screen share” functionality is activated by pressing “SHIFT+S”, a button for “screen share” on the media control device may produce a signal that is mapped to “SHIFT+S”. As another example, if a first video conferencing application has an input configuration in which a microphone is muted by pressing “F10” on a keyboard, a button for “mute” on the media control device may produce a signal that is mapped to “F10”. In some non-limiting embodiments, application input signals may include keystrokes, mouse movements, macros, operating system-level commands, and/or the like. At step 510, after mapping one or more signals from the media controller to an application input signal, the video conferencing application may be controlled based on the application input signal. For example, the computing device (through the operating system or an application executing thereon) may produce an emulated keyboard signal by sending an operating system-level command for “SHIFT+S”, “F10”, a mouse action, and/or the like.
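The mapping and emulation of steps 508-510 can be sketched as parsing a mapped shortcut string such as "SHIFT+S" into modifier and key tokens, which could then be injected as operating system-level input events (for example, with an input-injection library such as pynput on a desktop; that injection is not shown here). The shortcut syntax is an illustrative assumption:

```python
# Sketch of step 508/510: parse a mapped shortcut string into modifier
# and key tokens suitable for OS-level keystroke emulation. The
# "MOD+KEY" syntax is an illustrative assumption.
MODIFIERS = {"SHIFT", "CTRL", "ALT"}

def parse_shortcut(shortcut: str):
    """Split e.g. 'SHIFT+S' into (['SHIFT'], 'S')."""
    tokens = [t.strip().upper() for t in shortcut.split("+")]
    mods = [t for t in tokens if t in MODIFIERS]
    keys = [t for t in tokens if t not in MODIFIERS]
    if len(keys) != 1:
        raise ValueError(f"expected exactly one non-modifier key in {shortcut!r}")
    return mods, keys[0]
```

The driver would then press the modifier keys, press and release the main key, and release the modifiers, producing the same effect as the user typing the shortcut directly into the video conferencing application.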
Referring now to FIG. 6 , shown is a diagram of example components of a computing device 900 for implementing and performing the systems and methods described herein according to non-limiting embodiments. In some non-limiting embodiments, device 900 may include additional components, fewer components, different components, or differently arranged components than those shown in FIG. 6 . Device 900 may include a bus 902, a processor 904, memory 906, a storage component 908, an input component 910, an output component 912, and a communication interface 914. Bus 902 may include a component that permits communication among the components of device 900. In some non-limiting embodiments, processor 904 may be implemented in hardware, firmware, or a combination of hardware and software. For example, processor 904 may include a processor (e.g., a central processing unit (CPU), a graphics processing unit (GPU), an accelerated processing unit (APU), etc.), a microprocessor, a digital signal processor (DSP), and/or any processing component (e.g., a field-programmable gate array (FPGA), an application-specific integrated circuit (ASIC), etc.) that can be programmed to perform a function. Memory 906 may include random access memory (RAM), read only memory (ROM), and/or another type of dynamic or static storage device (e.g., flash memory, magnetic memory, optical memory, etc.) that stores information and/or instructions for use by processor 904.
With continued reference to FIG. 6 , storage component 908 may store information and/or software related to the operation and use of device 900. For example, storage component 908 may include a hard disk (e.g., a magnetic disk, an optical disk, a magneto-optic disk, a solid state disk, etc.) and/or another type of computer-readable medium. Input component 910 may include a component that permits device 900 to receive information, such as via user input (e.g., a touch screen display, a keyboard, a keypad, a mouse, a button, a switch, a microphone, etc.). Additionally, or alternatively, input component 910 may include a sensor for sensing information (e.g., a global positioning system (GPS) component, an accelerometer, a gyroscope, an actuator, etc.). Output component 912 may include a component that provides output information from device 900 (e.g., a display, a speaker, one or more light-emitting diodes (LEDs), etc.). Communication interface 914 may include a transceiver-like component (e.g., a transceiver, a separate receiver and transmitter, etc.) that enables device 900 to communicate with other devices, such as via a wired connection, a wireless connection, or a combination of wired and wireless connections. Communication interface 914 may permit device 900 to receive information from another device and/or provide information to another device. For example, communication interface 914 may include an Ethernet interface, an optical interface, a coaxial interface, an infrared interface, a radio frequency (RF) interface, a universal serial bus (USB) interface, a Wi-Fi® interface, a cellular network interface, and/or the like.
Device 900 may perform one or more processes described herein. Device 900 may perform these processes based on processor 904 executing software instructions stored by a computer-readable medium, such as memory 906 and/or storage component 908. A computer-readable medium may include any non-transitory memory device. A memory device includes memory space located inside of a single physical storage device or memory space spread across multiple physical storage devices. Software instructions may be read into memory 906 and/or storage component 908 from another computer-readable medium or from another device via communication interface 914. When executed, software instructions stored in memory 906 and/or storage component 908 may cause processor 904 to perform one or more processes described herein. Additionally, or alternatively, hardwired circuitry may be used in place of or in combination with software instructions to perform one or more processes described herein. Thus, embodiments described herein are not limited to any specific combination of hardware circuitry and software. The term “programmed or configured,” as used herein, refers to an arrangement of software, hardware circuitry, or any combination thereof on one or more devices.
Although non-limiting embodiments have been described in detail for the purpose of illustration based on what is currently considered to be the most practical and preferred embodiments, it is to be understood that such detail is solely for that purpose and that the invention is not limited to the disclosed embodiments, but, on the contrary, is intended to cover modifications and equivalent arrangements that are within the spirit and scope of the appended claims. For example, it is to be understood that the present invention contemplates that, to the extent possible, one or more features of any embodiment can be combined with one or more features of any other embodiment.

Claims (11)

The invention claimed is:
1. A media control device system comprising:
(a) a media control device comprising:
a housing;
a user interface configured to receive user input, the user interface comprising a plurality of buttons arranged on the housing, each button of the plurality of buttons associated with a different video conferencing function; and
a communication interface configured to establish communication with a separate computing device; and
(b) a computer-readable non-transitory medium including program instructions that, when executed by a processor of the separate computing device, cause the separate computing device to:
detect a video conferencing application being executed by the separate computing device, wherein the separate computing device is separate from the media control device;
in response to detecting the video conferencing application, determine an input configuration of the video conferencing application from a plurality of input configurations associated with a plurality of different video conferencing applications;
receive a plurality of signals from the media control device via the communication interface, the plurality of signals representing a plurality of different user inputs engaging at least one button of the plurality of buttons;
map the plurality of signals to a plurality of video conferencing application input signals based on the input configuration; and
control the video conferencing application based on the video conferencing application input signals.
2. The media control device system of claim 1, wherein the program instructions are part of a device driver executing on the separate computing device.
3. The media control device system of claim 1, wherein the video conferencing application is controlled based on the video conferencing application input signals by emulating, by the processor, the at least one video conferencing application input signal.
4. The media control device system of claim 3, wherein emulating the at least one video conferencing application input signal comprises generating at least one operating system-level command.
5. A media control device comprising:
a housing;
a user interface configured to receive user input arranged on the housing, the user interface comprising a plurality of buttons, each button of the plurality of buttons associated with a different video conferencing function;
a communication interface configured to establish communication with a separate computing device, the separate computing device separate from the housing; and
a controller arranged in the housing and in communication with the user interface and the communication interface, the controller configured to: generate signals based on the user input and control a plurality of different video conferencing applications executing on the separate computing device with the signals, the plurality of different video conferencing applications each associated with different input configurations, such that the signals are mapped to an input configuration of the different input configurations based on a video conferencing application of the plurality of different video conferencing applications.
6. The media control device of claim 5, wherein the controller generates the signals by converting the user input to keyboard inputs recognized by the video conferencing application.
7. A media control method comprising:
detecting, with a computing device, a video conferencing application being executed by the computing device;
in response to detecting the video conferencing application, determining, with the computing device, an input configuration of the video conferencing application from a plurality of input configurations associated with a plurality of different video conferencing applications;
receiving, with the computing device, at least one signal from a media control device separate from the computing device via a communication interface of the media control device, the at least one signal representing a user input on the media control device of a plurality of possible user inputs, wherein the user input comprises selection of at least one button of a plurality of buttons arranged on a housing of the media control device;
mapping the at least one signal to at least one video conferencing application input signal based on the input configuration; and
controlling the video conferencing application based on the at least one video conferencing application input signal.
8. The media control method of claim 7, wherein each button of the plurality of buttons produces a different signal when actuated, the method further comprising mapping each different signal to a different video conferencing application input signal based on the input configuration.
9. The media control method of claim 7, wherein a device driver installed on the computing device maps the at least one signal to the at least one video conferencing application input signal and controls the video conferencing application.
10. The media control method of claim 7, wherein controlling the video conferencing application based on the at least one video conferencing application input signal comprises emulating, with the computing device, the at least one video conferencing application input signal.
11. The media control method of claim 10, wherein emulating the at least one video conferencing application input signal comprises generating at least one operating system-level command.
US17/683,516 2021-03-02 2022-03-01 Media control device and system Active 2042-06-08 US11910130B2 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US17/683,516 US11910130B2 (en) 2021-03-02 2022-03-01 Media control device and system

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US202163155403P 2021-03-02 2021-03-02
US17/683,516 US11910130B2 (en) 2021-03-02 2022-03-01 Media control device and system

Publications (2)

Publication Number Publication Date
US20220286646A1 US20220286646A1 (en) 2022-09-08
US11910130B2 true US11910130B2 (en) 2024-02-20

Family

ID=83117627

Family Applications (1)

Application Number Title Priority Date Filing Date
US17/683,516 Active 2042-06-08 US11910130B2 (en) 2021-03-02 2022-03-01 Media control device and system

Country Status (1)

Country Link
US (1) US11910130B2 (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US11910130B2 (en) * 2021-03-02 2024-02-20 Carnegie Mellon University Media control device and system

Citations (37)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4686698A (en) * 1985-04-08 1987-08-11 Datapoint Corporation Workstation for interfacing with a video conferencing network
US4716585A (en) * 1985-04-05 1987-12-29 Datapoint Corporation Gain switched audio conferencing network
US5014267A (en) * 1989-04-06 1991-05-07 Datapoint Corporation Video conferencing network
US5574778A (en) * 1995-05-01 1996-11-12 Bell Communications Research, Inc. Method and apparatus for providing video services
US20020044201A1 (en) * 1998-01-06 2002-04-18 Intel Corporation Method and apparatus for controlling a remote video camera in a video conferencing system
US6380990B1 (en) * 1997-10-06 2002-04-30 Sony Corporation Method and apparatus for command and control of television receiver for video conferencing applications
US6489986B1 (en) * 2000-09-29 2002-12-03 Digeo, Inc. Remote control device for video and audio capture and communication
US20030169329A1 (en) * 1994-06-07 2003-09-11 Jeffrey L. Parker Multi-user camera control system and method
US20040212701A1 (en) * 2003-03-13 2004-10-28 Francois Ladouceur Control method and system for a remote video chain
US7221386B2 (en) * 2003-10-07 2007-05-22 Librestream Technologies Inc. Camera for communication of streaming media to a remote client
US20100128105A1 (en) * 2008-11-21 2010-05-27 Polycom, Inc. System and Method for Combining a Plurality of Video Stream Generated in a Videoconference
US7907222B2 (en) * 2005-09-08 2011-03-15 Universal Electronics Inc. System and method for simplified setup of a universal remote control
US8466951B2 (en) * 2009-04-06 2013-06-18 Chicony Electronics Co., Ltd. Wireless digital picture frame with video streaming capabilities
US20130155175A1 (en) * 2011-12-16 2013-06-20 Wayne E. Mock Customizing Input to a Videoconference Using a Remote Control Device
US20130271558A1 (en) * 2009-10-27 2013-10-17 Intaglio, Llc Method of operating a communication system
US20130300820A1 (en) * 2009-04-14 2013-11-14 Huawei Device Co., Ltd. Remote presenting system, device, and method
Patent Citations (38)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4716585A (en) * 1985-04-05 1987-12-29 Datapoint Corporation Gain switched audio conferencing network
US4686698A (en) * 1985-04-08 1987-08-11 Datapoint Corporation Workstation for interfacing with a video conferencing network
US5014267A (en) * 1989-04-06 1991-05-07 Datapoint Corporation Video conferencing network
US20030169329A1 (en) * 1994-06-07 2003-09-11 Jeffrey L. Parker Multi-user camera control system and method
US5574778A (en) * 1995-05-01 1996-11-12 Bell Communications Research, Inc. Method and apparatus for providing video services
US6380990B1 (en) * 1997-10-06 2002-04-30 Sony Corporation Method and apparatus for command and control of television receiver for video conferencing applications
US20020044201A1 (en) * 1998-01-06 2002-04-18 Intel Corporation Method and apparatus for controlling a remote video camera in a video conferencing system
US6489986B1 (en) * 2000-09-29 2002-12-03 Digeo, Inc. Remote control device for video and audio capture and communication
US20040212701A1 (en) * 2003-03-13 2004-10-28 Francois Ladouceur Control method and system for a remote video chain
US7221386B2 (en) * 2003-10-07 2007-05-22 Librestream Technologies Inc. Camera for communication of streaming media to a remote client
US7907222B2 (en) * 2005-09-08 2011-03-15 Universal Electronics Inc. System and method for simplified setup of a universal remote control
US8896654B2 (en) * 2008-03-12 2014-11-25 Dish Network L.L.C. Methods and apparatus for providing chat data and video content between multiple viewers
US20140247318A1 (en) * 2008-09-12 2014-09-04 Centurylink Intellectual Property Llc System and method for initiating a video conferencing through a streaming device
US20100128105A1 (en) * 2008-11-21 2010-05-27 Polycom, Inc. System and Method for Combining a Plurality of Video Stream Generated in a Videoconference
US8466951B2 (en) * 2009-04-06 2013-06-18 Chicony Electronics Co., Ltd. Wireless digital picture frame with video streaming capabilities
US20130300820A1 (en) * 2009-04-14 2013-11-14 Huawei Device Co., Ltd. Remote presenting system, device, and method
US20130271558A1 (en) * 2009-10-27 2013-10-17 Intaglio, Llc Method of operating a communication system
US8963982B2 (en) * 2010-12-31 2015-02-24 Skype Communication system and method
US8988483B2 (en) * 2011-06-06 2015-03-24 Ted Schwartz Mobile conferencing system
US8970653B2 (en) * 2011-06-16 2015-03-03 Vtel Products Corporation, Inc. Video conference control system and method
US8896651B2 (en) * 2011-10-27 2014-11-25 Polycom, Inc. Portable devices as videoconferencing peripherals
US20130155175A1 (en) * 2011-12-16 2013-06-20 Wayne E. Mock Customizing Input to a Videoconference Using a Remote Control Device
US8937636B2 (en) * 2012-04-20 2015-01-20 Logitech Europe S.A. Using previous selection information in a user interface having a plurality of icons
US8970658B2 (en) * 2012-04-20 2015-03-03 Logitech Europe S.A. User interface allowing a participant to rejoin a previously left videoconference
US20150296177A1 (en) * 2012-11-26 2015-10-15 Intouch Technologies, Inc. Enhanced video interaction for a user interface of a telepresence network
US20140282204A1 (en) * 2013-03-12 2014-09-18 Samsung Electronics Co., Ltd. Key input method and apparatus using random number in virtual keyboard
US20150201160A1 (en) * 2014-01-10 2015-07-16 Revolve Robotics, Inc. Systems and methods for controlling robotic stands during videoconference operation
US20160070368A1 (en) * 2014-09-05 2016-03-10 Samsung Electronics Co., Ltd. Method for controlling user interface and electronic device supporting the same
US20160191575A1 (en) * 2014-12-30 2016-06-30 Microsoft Technology Licensing, Llc Bridge Device for Large Meetings
US20170329486A1 (en) * 2015-05-29 2017-11-16 Tencent Technology (Shenzhen) Company Limited Method and device for interaction between terminals
US11006071B2 (en) * 2016-02-24 2021-05-11 Iron Bow Technologies, LLC Integrated telemedicine device
US20170357387A1 (en) * 2016-06-12 2017-12-14 Apple Inc. Identifying applications on which content is available
US11543938B2 (en) * 2016-06-12 2023-01-03 Apple Inc. Identifying applications on which content is available
US20210019982A1 (en) * 2016-10-13 2021-01-21 Skreens Entertainment Technologies, Inc. Systems and methods for gesture recognition and interactive video assisted gambling
US20180167578A1 (en) * 2016-12-09 2018-06-14 NetTalk.com, Inc. Method and Apparatus for Coviewing Video
US20190369827A1 (en) * 2018-06-03 2019-12-05 Apple Inc. Remote data input framework
US10751612B1 (en) * 2019-04-05 2020-08-25 Sony Interactive Entertainment LLC Media multi-tasking using remote device
US20220286646A1 (en) * 2021-03-02 2022-09-08 Carnegie Mellon University Media Control Device and System

Also Published As

Publication number Publication date
US20220286646A1 (en) 2022-09-08

Similar Documents

Publication Publication Date Title
US9690542B2 (en) Scaling digital personal assistant agents across devices
KR102045585B1 (en) Adaptive input language switching
AU2014201435B2 (en) Applications presentation method and system of mobile terminal
US8706920B2 (en) Accessory protocol for touch screen device accessibility
US8599105B2 (en) Method and apparatus for implementing a multiple display mode
US20160147406A1 (en) Method for providing graphical user interface and electronic device for supporting the same
US9720567B2 (en) Multitasking and full screen menu contexts
US20160147429A1 (en) Device for resizing window, and method of controlling the device to resize window
US10564791B2 (en) Method and apparatus for triggering a remote data entry interface
CN103677711A (en) Method for connecting mobile terminal and external display and apparatus implementing the same
US20150067550A1 (en) Dual screen system and method
US10462243B2 (en) Method and device for interaction between terminals
US20140282204A1 (en) Key input method and apparatus using random number in virtual keyboard
JP2018504798A (en) Gesture control method, device, and system
WO2020215969A1 (en) Content input method and terminal device
US20200233551A1 (en) Task switching method and terminal
EP3436889A1 (en) Touch-input support for an external touch-capable display device
KR20220100988A (en) How to move icons and electronic devices
US11910130B2 (en) Media control device and system
CN110012151B (en) Information display method and terminal equipment
WO2015014138A1 (en) Method, device, and equipment for displaying display frame
CN112424744A (en) Electronic device and volume adjusting method of electronic device
CN108196754B (en) Method, terminal and server for displaying object
US20160070368A1 (en) Method for controlling user interface and electronic device supporting the same
KR20130081502A (en) Method and apparatus for scroll displaying of items

Legal Events

Date Code Title Description
AS Assignment

Owner name: CARNEGIE MELLON UNIVERSITY, PENNSYLVANIA

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:RHEE, JINMO;REEL/FRAME:059131/0531

Effective date: 20220211

FEPP Fee payment procedure

Free format text: ENTITY STATUS SET TO UNDISCOUNTED (ORIGINAL EVENT CODE: BIG.); ENTITY STATUS OF PATENT OWNER: SMALL ENTITY

FEPP Fee payment procedure

Free format text: ENTITY STATUS SET TO SMALL (ORIGINAL EVENT CODE: SMAL); ENTITY STATUS OF PATENT OWNER: SMALL ENTITY

STPP Information on status: patent application and granting procedure in general

Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION

STPP Information on status: patent application and granting procedure in general

Free format text: NON FINAL ACTION MAILED

STPP Information on status: patent application and granting procedure in general

Free format text: NOTICE OF ALLOWANCE MAILED -- APPLICATION RECEIVED IN OFFICE OF PUBLICATIONS

STPP Information on status: patent application and granting procedure in general

Free format text: PUBLICATIONS -- ISSUE FEE PAYMENT VERIFIED

STCF Information on status: patent grant

Free format text: PATENTED CASE