US20120287231A1 - Media sharing during a video call - Google Patents
- Publication number
- US20120287231A1 (U.S. application Ser. No. 13/470,336)
- Authority
- US
- United States
- Prior art keywords
- video call
- media content
- video
- signals
- media
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Abandoned
Classifications
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N7/00—Television systems
- H04N7/14—Systems for two-way working
- H04N7/141—Systems for two-way working between two video terminals, e.g. videophone
- H04N7/148—Interfacing a video terminal to a particular transmission medium, e.g. ISDN
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/60—Control of cameras or camera modules
- H04N23/63—Control of cameras or camera modules by using electronic viewfinders
- H04N23/633—Control of cameras or camera modules by using electronic viewfinders for displaying additional information relating to control or operation of the camera
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/60—Control of cameras or camera modules
- H04N23/66—Remote control of cameras or camera parts, e.g. by remote control devices
- H04N23/661—Transmitting camera control signals through networks, e.g. control via the Internet
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04W—WIRELESS COMMUNICATION NETWORKS
- H04W12/00—Security arrangements; Authentication; Protecting privacy or anonymity
- H04W12/50—Secure pairing of devices
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/60—Control of cameras or camera modules
- H04N23/61—Control of cameras or camera modules based on recognised objects
- H04N23/611—Control of cameras or camera modules based on recognised objects where the recognised objects include parts of the human body
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04W—WIRELESS COMMUNICATION NETWORKS
- H04W12/00—Security arrangements; Authentication; Protecting privacy or anonymity
- H04W12/60—Context-dependent security
- H04W12/69—Identity-dependent
- H04W12/77—Graphical identity
Definitions
- the present disclosure relates generally to video calling. More particularly, the present disclosure relates to sharing media during a video call.
- the traditional use of television has been for passive consumption of content.
- the content is mostly television programming (live as well as on-demand) and outputs of other local devices such as media players (for example, DVD, CD, and VCR devices), video game devices, and the like.
- One solution involves connecting a computer to a webcam, speakers, microphone, and the television screen, installing and executing video calling software on the computer, and controlling the computer using a keyboard and mouse. Users generally avoid such tedious tasks.
- an embodiment features a video call device comprising: a video input interface configured to receive first video information; an audio input interface configured to receive first audio information; a transmitter configured to transmit first signals during a video call, wherein the first signals represent the first video information and the first audio information; a receiver configured to receive second signals during the video call, wherein the second signals represent second video information and second audio information; a video output interface configured to provide the second video information; an audio output interface configured to provide the second audio information; wherein the transmitter is further configured to transmit third signals during the video call, wherein the third signals represent at least one of media content, and a hyperlink, wherein the hyperlink indicates a location of the media content.
- Embodiments of the video call device can include one or more of the following features.
- Some embodiments comprise an encoder/decoder (CODEC); wherein the receiver is further configured to receive fourth signals, wherein the fourth signals represent a link partner media quality request, wherein the link partner media quality request indicates a desired quality for the first video information; wherein the CODEC is configured to transcode the first video information according to the desired quality prior to the first signals being transmitted by the transmitter.
- Some embodiments comprise a media interface configured to receive the media content; wherein the third signals represent the media content.
- Some embodiments comprise an encoder/decoder (CODEC); wherein the receiver is further configured to receive fourth signals, wherein the fourth signals represent a link partner media quality request, wherein the link partner media quality request indicates a desired quality for the media content; wherein the CODEC is configured to transcode the media content according to the desired quality prior to the third signals being transmitted by the transmitter.
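The quality negotiation in these embodiments can be sketched in code. The following is a minimal illustration, not part of the disclosure: the request fields and the ladder of encoding presets are assumptions, standing in for whatever parameters the CODEC 218 actually negotiates.

```python
from dataclasses import dataclass

# Hypothetical representation of a link partner media quality request.
@dataclass
class MediaQualityRequest:
    max_width: int        # desired maximum frame width in pixels
    max_height: int       # desired maximum frame height in pixels
    max_bitrate_kbps: int # desired maximum bitrate

# Illustrative encoding presets the CODEC might choose from, best first.
PRESETS = [
    {"name": "1080p", "width": 1920, "height": 1080, "bitrate_kbps": 4000},
    {"name": "720p",  "width": 1280, "height": 720,  "bitrate_kbps": 2000},
    {"name": "480p",  "width": 854,  "height": 480,  "bitrate_kbps": 1000},
    {"name": "240p",  "width": 426,  "height": 240,  "bitrate_kbps": 400},
]

def select_preset(request: MediaQualityRequest) -> dict:
    """Pick the highest-quality preset that satisfies the partner's request."""
    for preset in PRESETS:
        if (preset["width"] <= request.max_width
                and preset["height"] <= request.max_height
                and preset["bitrate_kbps"] <= request.max_bitrate_kbps):
            return preset
    return PRESETS[-1]  # fall back to the lowest quality
```

A device receiving a link partner media quality request could pass it to `select_preset` and transcode the outgoing video or shared media content accordingly before transmission.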
- the media interface comprises at least one of: an SD card interface; a USB interface; and a mass storage interface.
- Some embodiments comprise a processor configured to generate one or more first playback synchronization commands, wherein the first playback synchronization commands include timing information for playback of the media content; wherein the transmitter is further configured to transmit fourth signals during the video call, wherein the fourth signals represent the one or more first playback synchronization commands.
- the receiver is further configured to receive fifth signals during the video call, wherein the fifth signals represent one or more second playback synchronization commands; and the processor is further configured to control playback of the media content according to the one or more second playback synchronization commands.
- the playback synchronization commands represent at least one of: a file transfer status for the media content; a playback position for the media content; and a time of a modification of the media content by a user.
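One plausible wire format for such a playback synchronization command is sketched below; the field names and the JSON encoding are illustrative assumptions, since the disclosure does not specify an encoding.

```python
import json
from dataclasses import dataclass, asdict
from typing import Optional

@dataclass
class PlaybackSyncCommand:
    """Hypothetical wire format for a playback synchronization command."""
    media_id: str                               # identifies the shared media content
    transfer_status: Optional[str] = None       # e.g. "in_progress", "complete"
    playback_position_ms: Optional[int] = None  # current playback position
    modified_at_ms: Optional[int] = None        # time a user modified the content

    def encode(self) -> bytes:
        # Drop unset fields so the command carries only what it asserts.
        fields = {k: v for k, v in asdict(self).items() if v is not None}
        return json.dumps(fields).encode()

    @classmethod
    def decode(cls, data: bytes) -> "PlaybackSyncCommand":
        return cls(**json.loads(data))
```

A command like this would travel over the command channel; the receiving device inspects whichever of the three optional fields is present.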
- Some embodiments comprise one or more cameras configured to provide the first video information to the video input interface; and one or more microphones configured to provide the first audio information to the audio input interface.
- an embodiment features a method comprising: receiving first video information; receiving first audio information; transmitting first signals during a video call, wherein the first signals represent the first video information and the first audio information; receiving second signals during the video call, wherein the second signals represent second video information and second audio information; providing the second video information; providing the second audio information; and transmitting third signals during the video call, wherein the third signals represent at least one of media content, and a hyperlink, wherein the hyperlink indicates a location of the media content.
- Embodiments of the method can include one or more of the following features. Some embodiments comprise receiving fourth signals, wherein the fourth signals represent a link partner media quality request, wherein the link partner media quality request indicates a desired quality for the first video information; and transcoding the first video information according to the desired quality prior to transmitting the first signals. Some embodiments comprise receiving the media content; wherein the third signals represent the media content. Some embodiments comprise receiving fourth signals, wherein the fourth signals represent a link partner media quality request, wherein the link partner media quality request indicates a desired quality for the media content; and transcoding the media content according to the desired quality prior to transmitting the third signals.
- Some embodiments comprise generating one or more first playback synchronization commands, wherein the first playback synchronization commands include timing information for playback of the media content; and transmitting fourth signals during the video call, wherein the fourth signals represent the one or more first playback synchronization commands. Some embodiments comprise receiving fifth signals during the video call, wherein the fifth signals represent one or more second playback synchronization commands; and controlling playback of the media content according to the one or more second playback synchronization commands.
- the playback synchronization commands represent at least one of: a file transfer status for the media content; a playback position for the media content; and a time of a modification of the media content by a user.
- an embodiment features non-transitory computer-readable media embodying instructions executable by a computer to perform functions comprising: receiving first video information and first audio information; causing transmission of first signals during a video call, wherein the first signals represent the first video information and the first audio information; providing second video information and second audio information based on second signals received during the video call, wherein the second signals represent the second video information and the second audio information; causing transmission of third signals during the video call, wherein the third signals represent at least one of media content, and a hyperlink, wherein the hyperlink indicates a location of the media content.
- Embodiments of the non-transitory computer-readable media can include one or more of the following features.
- the functions further comprise: receiving a link partner media quality request, wherein the link partner media quality request indicates a desired quality for the first video information; and transcoding the first video information according to the desired quality prior to causing transmission of the first signals.
- the functions further comprise: receiving the media content; wherein the third signals represent the media content.
- the functions further comprise: receiving a link partner media quality request, wherein the link partner media quality request indicates a desired quality for the media content; and transcoding the media content according to the desired quality prior to causing transmission of the third signals.
- the functions further comprise: generating one or more first playback synchronization commands, wherein the first playback synchronization commands include timing information for playback of the media content; and causing transmission of fourth signals during the video call, wherein the fourth signals represent the one or more first playback synchronization commands.
- the functions further comprise: receiving one or more second playback synchronization commands; and controlling playback of the media content according to the one or more second playback synchronization commands.
- the playback synchronization commands represent at least one of: a file transfer status for the media content; a playback position for the media content; and a time of a modification of the media content by a user.
- FIG. 1 shows elements of a video calling system according to one embodiment.
- FIG. 2 shows elements of a video calling device of FIG. 1 according to one embodiment.
- FIG. 3 shows a process for the video calling system of FIG. 1 according to an embodiment where a first video call device shares local media content with a second video call device during a video call.
- FIG. 4 shows a process for the video calling system of FIG. 1 according to an embodiment where two video call devices share media content stored at a remote location during a video call.
- FIG. 5 shows a process for the video calling system of FIG. 1 according to an embodiment where a first video call device transcodes video and shared local media content for a second video call device during a video call.
- the described embodiments provide media sharing during a video call while not requiring a personal computer.
- the video calls and media sharing can be point-to-point or multi-point. The system is not limited to video calls; voice-only and video-only calls are supported as well.
- Embodiments ensure that the shared media are rendered (that is, played back to the participants) such that the playback timing, as well as the apparent quality of the media, is nearly identical for all participants. Before describing these aspects, an example video call device is described.
- FIG. 1 shows elements of a video calling system 100 according to one embodiment. Although in the described embodiments the elements of the video calling system 100 are presented in one arrangement, other embodiments may feature other arrangements. For example, elements of the video calling system 100 can be implemented in hardware, software, or combinations thereof.
- the video calling system 100 includes N video call devices 102 A and 102 B through 102 N connected by a network 108 .
- Network 108 can be implemented as a wide-area network such as the Internet, a local-area network (LAN), or the like. While various embodiments are described with respect to network communications, they also apply to devices employing other forms of data communications such as direct links and the like.
- each video call device 102 does not include display screens or speakers. Therefore each video call device 102 is connected to a respective television set (TV) 106 A and 106 B through 106 N. In other embodiments, one or more of the video call devices 102 includes a display screen and speakers, so one or more television sets 106 are not required. In FIG. 1 , each video call device 102 is controlled by one or more respective users, for example using one or more respective remote controls (RC) 110 .
- FIG. 2 shows elements of a video call device 102 of FIG. 1 according to one embodiment.
- the elements of video call device 102 are presented in one arrangement, other embodiments may feature other arrangements.
- elements of video call device 102 can be implemented in hardware, software, or combinations thereof.
- the video call device 102 includes an audio-visual (AV) interface (I/F) 202 , a network adapter 204 , a media interface 206 , and a remote control (RC) interface 208 .
- the video call device 102 also includes a processor or central processing unit (CPU) 210 , a graphical processing unit (GPU) 212 , a memory 214 , a coder/decoder (CODEC) 218 , a multiplexer (MUX) 220 , and a clock 222 .
- the AV interface 202 includes a video input interface (Video In) 224 , an audio input interface (Audio In) 226 , a video output interface (Video Out) 228 , and an audio output interface (Audio Out) 230 .
- the video input interface 224 can be connected to one or more video capture devices such as a camera 232 or the like. Camera 232 can be implemented as a wide-angle camera that sees the whole room.
- the audio input interface 226 can be connected to one or more audio capture devices such as a microphone 234 or the like.
- Microphone 234 can be implemented as a noise-cancelling microphone.
- video call device 102 includes one or more cameras 232 and/or one or more microphones 234 .
- multiple cameras 232 can be included to generate three-dimensional (3D) video.
- multiple microphones 234 can be included so that beamforming techniques can be used to isolate conversations from background noise.
- the video output interface 228 can be connected to a display screen such as that of a television set 106 .
- the audio output interface 230 can be connected to one or more speakers such as those of a television set 106 .
- the video output interface 228 and/or the audio output interface 230 can be connected to the audio-visual inputs of a home theater system or the like.
- the video output interface 228 and the audio output interface 230 can employ any appropriate connection, for example such as Digital Visual Interface (DVI), High-Definition Multimedia Interface (HDMI), and the like.
- the network adapter 204 includes a wireless network adapter 236 and a wired network adapter 238 .
- network adapter 204 includes additional communication interfaces, for example including Bluetooth communication interfaces and the like.
- the wireless network adapter 236 includes a transmitter (TX) 240 to transmit wireless signals and a receiver (RX) 242 to receive wireless signals, and is connected to one or more antennas 244 .
- wireless network adapter 236 is compliant with all or part of IEEE standard 802.11, including draft and approved amendments such as 802.11-1997, 802.11a, 802.11b, 802.11g, 802.11-2007, 802.11n, 802.11-2012, and 802.11ac.
- the wireless network adapter 236 can allow Wi-Fi connections, for example to a router, to other Wi-Fi devices such as smartphones and computers, and the like.
- the wired network adapter 238 includes a transmitter (TX) 246 to transmit wired signals and a receiver (RX) 248 to receive wired signals, and is connected to a wired network interface 250 .
- wired network adapter 238 is compliant with all or part of IEEE standard 802.3, including draft and approved amendments.
- the disclosed video call devices 102 are capable of peer-to-peer (P2P) audio/video communication.
- the video call devices 102 can be connected to each other by one or more networks such that data packets can flow between them.
- the video call devices 102 can be located anywhere in the world, so long as they are connected by networks 108 such as the Internet.
- the video call devices 102 can employ multiple communication channels between participants.
- One channel carries the primary video stream of the video call.
- Another channel carries the primary audio stream of the video call.
- a command channel carries commands such as camera commands (for example, pan, tilt, and zoom) and the like.
- the command channel can also carry synchronization commands to ensure synchronized media playback across multiple sites. Additional channels can be employed for other tasks such as media sharing and the like.
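The multi-channel arrangement above can be illustrated with a simple framing sketch; the channel identifiers and the 5-byte header layout are assumptions chosen for illustration, since in practice the channels would typically be provided by the underlying P2P technology.

```python
import struct

# Hypothetical channel identifiers for the per-call communication channels.
CH_VIDEO, CH_AUDIO, CH_COMMAND, CH_MEDIA = 0, 1, 2, 3

def frame(channel: int, payload: bytes) -> bytes:
    """Prefix a payload with a 1-byte channel ID and a 4-byte big-endian length."""
    return struct.pack("!BI", channel, len(payload)) + payload

def deframe(data: bytes) -> tuple:
    """Return (channel, payload, remaining bytes) for the first frame in data."""
    channel, length = struct.unpack("!BI", data[:5])
    return channel, data[5:5 + length], data[5 + length:]
```

With framing like this, a camera command and an audio packet can share one transport while remaining separable at the receiver.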
- P2P technologies provide multiple communication channels for each video call device 102 .
- the video call device 102 can employ the provided channels and/or channels established outside the chosen P2P technology.
- P2P technologies generally provide network address translation (NAT) traversal for their channels.
- the video call devices 102 described herein can provide NAT traversal for channels established outside the chosen P2P technology.
- the media interface 206 receives local media content from external sources, and provides that media content to one or both of processors 210 and 212 .
- the media interface 206 includes a Secure Digital (SD) interface 252 , a Universal Serial Bus (USB) interface 254 , and a mass storage interface 216 .
- Other embodiments can include other interfaces.
- the SD interface 252 receives SD cards, and provides media content stored thereon to the CPU 210 and the GPU 212 .
- the USB interface 254 receives USB devices such as USB memory sticks, USB-cabled devices, and the like, and provides media content from those devices to the CPU 210 and the GPU 212 .
- the USB interface 254 can also receive input devices such as USB dongles for wireless keyboards, wireless pointing devices, and the like.
- the mass storage interface 216 allows for connection to mass storage devices such as external solid-state drives, disk drives, and the like, and provides media content stored thereon to the CPU 210 and the GPU 212 .
- the remote control (RC) interface 208 receives wireless signals such as infrared signals from remote control devices for controlling the video call device 102 .
- the video call device 102 can be controlled by a wireless device via the wireless network adapter 236 .
- the CPU 210 handles general processing functions, while the GPU 212 handles graphic processing functions. In some embodiments, the CPU 210 handles graphic processing functions as well, so the GPU 212 is not required.
- the CPU 210 receives a time base from clock 222 .
- the memory 214 can be implemented as semiconductor memory and the like.
- the CODEC 218 provides encoding, decoding, and transcoding of the audio and video data handled by the video call device 102 .
- the CODEC 218 is compliant with one or more standards such as the H.264 standard and the like.
- the MUX 220 allows audio and video to be exchanged via the A/V interface 202 , a virtual interface 256 , or both.
- the MUX 220 allows any of the inputs and outputs to be switched with virtual inputs and outputs.
- audio and video can be provided to and/or from other local devices such as smartphones, portable cameras, document cameras, computer displays of external computers, and the like.
- the described video call devices 102 provide for sharing of arbitrary media content during the video call.
- the media content is provided by a video call device 102 .
- the media content is stored at a remote location, and a video call device 102 provides a hyperlink that indicates that location.
- the hyperlink can include a uniform resource locator (URL), Internet protocol (IP) address, or the like.
- Any type of media content can be shared. Examples of media content that can be shared include photos; documents in various formats; document snapshots; screen snapshots; video files and streams; audio files and streams; and the like.
- Video call participants can provide audio and/or video commentary during the video call while sharing the media content.
- the described video call devices 102 allow users to share photos during a video call. For example, a user can prepare a playlist of photos during or before a video call. During the video call, the user can manually step through the playlist, thereby deciding the sequence and pace of sharing the photos in real time. Alternatively, the user can prepare the playlist with the desired sequence and share the photos such that the sequence of photos advances automatically. The photos can be transferred as files during the video call.
- the described video call devices 102 allow users to share documents during a video call.
- a user can choose documents in various formats for sharing.
- the formats can include text and binary formats, from simple text-only documents to documents that include text and graphics.
- the formats can include portable formats, webpage formats, and the like.
- the documents can be transferred as files during the video call.
- the described video call devices 102 allow users to share document snapshots during a video call.
- the documents can include the documents mentioned above, webpages, and the like.
- a snapshot can be an image representing a document.
- the image can be in any format.
- the snapshot can be a bit-map recording of the rendered webpage content.
- the user can use the web browser during or before the call to take snapshots of web content to share in the call.
- the document snapshots can be transferred as files during the video call.
- the described video call devices 102 allow users to share video files and streams during a video call.
- the video files and streams can be recorded using camera 232 .
- the video files and streams can be imported into the video call devices 102 from other sources, such as memory cards, other physical devices, networks such as the Internet, and the like.
- the video files and streams can be transferred as files or streamed during the video call.
- the described video call devices 102 allow users to share audio files and streams during a video call.
- the audio files and streams can be recorded using microphone 234 .
- the audio files and streams can be imported into the video call devices 102 from other sources, such as memory cards, other physical devices, networks such as the Internet, and the like.
- the audio files can be transferred as files or streamed during the video call.
- the described video call devices 102 allow users to share applications and screens during a video call. For example, during execution of an application a live computer screen can be shared as a video or one or more snapshots. In many cases, sharing an application is more useful than sharing documents produced by the application. One advantage is that the other users need not execute, or even possess, the application. Another advantage is that the document views produced by the application are the same for all users. For example, it is more useful to share a view of a small portion of a spreadsheet than to share the spreadsheet file, have all users execute the spreadsheet application, and then have all users navigate to the same place in the spreadsheet. As another example, it is simpler to share views of a secure webpage than to have all of the users log on to the web site and find the same view of the same webpage.
- the described video call devices 102 allow users to share playlists of media content.
- the playlist can include only one type of media, or multiple types of media.
- the media content can be shared consecutively, and in some cases, simultaneously.
- a playlist of photos can be accompanied by music.
- the playlist can include media files, media streams, and hyperlinks to media content. URLs to photo playlists can also be shared, such that photos are fetched and rendered in a synchronized manner for all participants during the call. The order of stepping through the media can be chosen by any participant during the call.
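A playlist mixing files, streams, and hyperlinks, with any participant able to step through it, might be modeled as follows; the class and field names are illustrative assumptions.

```python
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class PlaylistEntry:
    """One item in a shared playlist; kind is 'file', 'stream', or 'url'."""
    kind: str
    location: str                     # local path, stream ID, or hyperlink
    duration_ms: Optional[int] = None # None means the entry advances manually

@dataclass
class Playlist:
    entries: List[PlaylistEntry] = field(default_factory=list)
    position: int = 0

    def step(self, delta: int = 1) -> PlaylistEntry:
        """Any participant may step forward or back; clamp to the bounds."""
        self.position = max(0, min(len(self.entries) - 1, self.position + delta))
        return self.entries[self.position]
```

After each step, a synchronization command carrying the new position could be broadcast so every device renders the same entry.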
- the described video call devices 102 allow users to share media content using hyperlinks for the media content during a video call.
- the hyperlink can be the URL of a webpage, which is then both fetched and rendered in a synchronized manner for all participants during the video call.
- URLs to playlists can also be shared, such that playlist media content are fetched and rendered in a synchronized manner for all participants during the call.
- the order of stepping through the media content can be chosen by any participant during the call.
- the hyperlinks can be transferred as files during the video call.
- the media content shared during the video call can originate from many sources.
- a participant can insert an SD card storing a media file into the SD interface 252 of a video call device 102 .
- a participant can obtain a media file from an external device such as a smartphone, computer, or the like using the USB interface 254 of a video call device 102 , the wired network adapter 238 or wireless network adapter 236 of a video call device 102 , or the like.
- a participant can record a media file using the camera 232 of a video call device 102 .
- a participant can download a media file from the Internet into a video call device 102 .
- a participant can generate snapshots of web pages while browsing the web on a video call device 102 .
- a participant can share a URL for media files stored in a network such as the Internet.
- other sources can be used.
- the described video call devices 102 also provide synchronized media playback such that all participants of a video call have an almost identical contemporaneous experience while playing the media content.
- media files can be downloaded or streamed from the Internet to video call devices 102 or transmitted from one video call device 102 to another with the goal of playing them back in a synchronized manner.
- the video call devices 102 employ synchronization algorithms to keep track of what data are available at what video call devices 102 at a certain time. To minimize latencies, some files are transmitted (or fetched) ahead of time. Data are pre-fetched (or pre-transmitted) in anticipation that those data will be required as the playback advances.
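The pre-fetch behavior can be sketched as a look-ahead over the playlist; the window size and function name are illustrative assumptions.

```python
def prefetch_plan(playlist, position, fetched, window=2):
    """Return the items to fetch next: everything within `window` items
    ahead of the current playback position that has not yet been fetched."""
    upcoming = playlist[position:position + 1 + window]
    return [item for item in upcoming if item not in fetched]
```

A real implementation would additionally budget the fetches against available network throughput, as the disclosure notes, so the pre-fetch does not starve the audio/video packet flow.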
- the synchronization algorithms also consider how much network throughput is available, while not adversely affecting the audio/video packet flow.
- Synchronization scenarios are described below.
- when sharing files such as photos, synchronization involves determining when all participants have obtained the file.
- when sharing streams and video files, synchronization involves matching playback positions in the streams.
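Matching playback positions in a stream can be sketched as a drift check against a reference position; the tolerance value is an illustrative assumption.

```python
def compute_seek(local_position_ms, remote_position_ms, tolerance_ms=100):
    """Return how far (in ms) the local player should jump to match the
    remote playback position; 0 if the drift is already within tolerance."""
    drift = remote_position_ms - local_position_ms
    return drift if abs(drift) > tolerance_ms else 0
```

Each device would periodically receive the reference position over the command channel and seek by the returned amount.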
- the media can originate not only from video call participants' end-points, but also from remote sites such as Internet servers. In the latter case, each participant's video call device 102 fetches the media using the same URLs in a synchronized manner such that the media can be played back in a synchronized manner.
- other events can be synchronized during a video call.
- when a user modifies the shared media content during the call, the modification can be synchronized so that it is rendered simultaneously for all video call participants.
- Example modifications can include rotating an image after it is shared, changing the cursor position in a video (jogging), zooming in on an image or video, marking up the shared media content, and the like.
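Such modifications can be modeled as ordered events broadcast over the command channel, so that every device applies them identically; the event fields below are illustrative assumptions.

```python
from dataclasses import dataclass

@dataclass
class ModificationEvent:
    """Hypothetical modification event broadcast over the command channel."""
    media_id: str
    kind: str        # e.g. "rotate", "zoom", "markup", "jog"
    params: dict     # e.g. {"degrees": 90} for a rotation
    sequence: int    # orders events so every device applies them identically

def apply_in_order(events):
    """Replay events by sequence number so all participants reach the
    same final state regardless of network arrival order."""
    return sorted(events, key=lambda e: e.sequence)
```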
- FIG. 3 shows a process 300 for the video calling system 100 of FIG. 1 according to an embodiment where a first video call device 102 A shares local media content with a second video call device 102 B during a video call.
- the elements of process 300 are presented in one arrangement, other embodiments may feature other arrangements.
- some or all of the elements of process 300 can be executed in a different order, concurrently, and the like.
- some elements of process 300 may not be performed, and may not be executed immediately after each other.
- only two video call devices 102 A,B are shown in FIG. 3 . However, it should be understood that more than two video call devices 102 can participate in process 300 .
- the video call devices 102 A,B conduct a video call.
- the first video call device 102 A receives first audio information and first video information AV 1 at 302 , for example from a local camera 232 and microphone 234 .
- the second video call device 102 B receives second audio information and second video information AV 2 at 304 .
- the video call devices 102 A,B exchange the first and second audio and video information AV 1 and AV 2 at 306 .
- the first video call device 102 A renders the second audio and video information AV 2 at 308 , for example on television set 106 A.
- the second video call device 102 B renders the first audio and video information AV 1 at 310 , for example on television set 106 B. This exchange can continue for the remainder of process 300 .
- the first video call device 102 A receives local media content.
- the first video call device 102 A can receive a photo stored on an SD card inserted in SD interface 252 .
- the first video call device 102 A sends the media content to the second video call device 102 B. While the media content is being transferred, the displays at both ends of the video call show the status of the transfer.
- the video call devices 102 exchange one or more synchronization commands over the command channel to indicate completion of the transfer.
- the processor 210 generates one or more synchronization commands, and either transmitter 240 or transmitter 246 transmits signals representing the one or more synchronization commands.
- either receiver 242 or receiver 248 receives the signals representing the one or more synchronization commands, and processor 210 controls the playback of the media content according to the one or more synchronization commands.
- the video call devices 102 render the media content at the same time.
- the video call devices 102 render a photo simultaneously.
- the video call devices 102 begin playback of a video file simultaneously.
- video call device 102 A renders the media content at 318, and video call device 102 B renders the media content at 320.
- a two-way exchange of synchronization commands is shown.
- a single synchronization command can be sent.
- the second video call device 102 B can send a single synchronization command to the first video call device 102 A when the transfer of the media content to the second video call device 102 B is complete, is sufficiently complete to begin playback, or the like.
- rendering of a video can begin when a sufficient amount of the video has been transferred rather than waiting for the transfer to complete.
- when the media content includes multiple photos, rendering of one photo can begin while subsequent photos are being transferred.
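The synchronization exchange in process 300 can be sketched as follows. This is a minimal illustration: the in-memory channel, the message fields ("cmd", "start_at"), and the 20% threshold are assumptions for illustration only; the patent does not specify a wire format for the command channel or a threshold for "sufficiently complete."

```python
import time

class SyncChannel:
    """Toy in-memory stand-in for the command channel between two devices."""
    def __init__(self):
        self.messages = []

    def send(self, command):
        self.messages.append(command)

    def receive(self):
        return self.messages.pop(0)

def receiver_ready(bytes_received, total_bytes, min_fraction=0.2):
    # Playback may begin before the transfer completes, once a sufficient
    # amount of the media has arrived (the 20% threshold is an assumed value).
    return bytes_received >= min_fraction * total_bytes

channel = SyncChannel()

# Receiver side: 3 MB of a 10 MB video has arrived -- enough to start.
if receiver_ready(3_000_000, 10_000_000):
    channel.send({"cmd": "ready", "start_at": time.time() + 0.5})

# Sender side: on receiving "ready", both devices schedule rendering for the
# same instant, so playback begins simultaneously at both ends.
ack = channel.receive()
assert ack["cmd"] == "ready"
```

A single "ready" command from the receiver, as in the single-command variant above, is enough; a two-way exchange simply repeats the same pattern in both directions.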
- FIG. 4 shows a process 400 for the video calling system 100 of FIG. 1 according to an embodiment where two video call devices 102 A,B share media content stored at a remote location during a video call.
- although the elements of process 400 are presented in one arrangement, other embodiments may feature other arrangements.
- some or all of the elements of process 400 can be executed in a different order, concurrently, and the like.
- some elements of process 400 may not be performed, and may not be executed immediately after each other.
- only two video call devices 102 A,B are shown in FIG. 4 . However, it should be understood that more than two video call devices 102 can participate in process 400 .
- the video call devices 102 A,B conduct a video call.
- the first video call device 102 A receives first audio information and first video information AV 1 at 402 , for example from a local camera 232 and microphone 234 .
- the second video call device 102 B receives second audio information and second video information AV 2 at 404 .
- the video call devices 102 A,B exchange the first and second audio and video information AV 1 and AV 2 at 406 .
- the first video call device 102 A renders the second audio and video information AV 2 at 408 , for example on television set 106 A.
- the second video call device 102 B renders the first audio and video information AV 1 at 410 , for example on television set 106 B. This exchange can continue for the remainder of process 400 .
- the first video call device 102 A receives a hyperlink such as a URL.
- a user of the first video call device 102 A can input the hyperlink using a remote control or wireless keyboard.
- the first video call device 102 A sends the hyperlink to the second video call device 102 B.
- Each video call device 102 uses the hyperlink to independently get the media content from a network server 430 indicated by the hyperlink.
- the first video call device 102 A uses the hyperlink to get the media content
- the second video call device 102 B uses the hyperlink to get the media content.
- the video call devices 102 exchange one or more synchronization commands over the command channel to indicate completion of the transfer.
- the video call devices 102 render the media content at the same time.
- the video call devices 102 render a photo simultaneously.
- the video call devices 102 begin playback of a video file simultaneously.
- video call device 102 A renders the media content at 422, and video call device 102 B renders the media content at 424.
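The hyperlink-sharing flow of process 400 can be sketched as follows. The `fetch()` function and the server table are hypothetical stand-ins for retrieving the media content from the network server indicated by the hyperlink; the URL is invented for illustration.

```python
# Device 102A sends only a hyperlink over the command channel; each device
# then independently retrieves the media content the hyperlink indicates.
REMOTE_SERVER = {"http://example.com/album/photo1.jpg": b"jpeg-bytes"}

def fetch(url):
    # Stand-in for an HTTP GET against the network server.
    return REMOTE_SERVER[url]

def share_by_hyperlink(url):
    # Both devices resolve the same URL on their own, so each receives the
    # media in its original format rather than a re-encoded copy.
    content_a = fetch(url)  # first video call device
    content_b = fetch(url)  # second video call device
    return content_a == content_b

assert share_by_hyperlink("http://example.com/album/photo1.jpg")
```

Because each device fetches the content independently, the sender never streams the media itself, and every participant sees the identical source bytes.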
- the described video call devices 102 preserve the apparent quality of media shared during a video call.
- One illustrative example is the case of photo sharing.
- in some existing approaches, a photo slideshow is treated as a video. That is, the photos are compressed using a video compression technique and streamed to other participants.
- this approach introduces undesirable video compression artifacts. For example, each photo appears blurry at first, and then is enhanced over time.
- hardware video encoders are often forced to insert a key frame (also called an intra frame, intra-coded frame, or I-frame) at regular intervals. In the case of a photo, this is equivalent to communicating the photo again from scratch, which can cause the photo's appearance to cycle repeatedly from blurry to crisp and back to blurry.
- described embodiments send the photo files to all participants so the photos appear the same to all participants, including the sender.
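A back-of-the-envelope sketch of the bandwidth argument: streaming a still photo as video forces the encoder to re-send the photo at every key frame, while a one-time file transfer pays the cost once. All sizes and intervals below are illustrative assumptions, not measured values.

```python
def streamed_bytes(iframe_bytes, pframe_bytes, fps, iframe_interval_s, duration_s):
    # Total bytes to stream a static image as video for duration_s seconds.
    frames = int(fps * duration_s)
    iframes = int(duration_s / iframe_interval_s)
    pframes = frames - iframes
    return iframes * iframe_bytes + pframes * pframe_bytes

photo_file = 400_000                   # one-time transfer of the JPEG itself
streamed = streamed_bytes(
    iframe_bytes=150_000,              # each I-frame re-encodes the photo
    pframe_bytes=2_000,                # P-frames are small for a static image
    fps=30, iframe_interval_s=2, duration_s=30)

# Viewing the photo for 30 seconds as a video stream costs several times the
# file itself, and each I-frame restarts the blurry-to-crisp cycle.
assert streamed > photo_file
```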
- the described video call devices 102 also consider the resolution of the display devices employed by video call participants, both for the primary video stream and for media sharing. During setup of a video call, each video call device 102 informs the other video call devices 102 of its display device resolution. During the video call, the video call devices 102 employ transcoding to generate video streams of appropriate resolutions, thereby preserving video quality while reducing bandwidth usage.
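The resolution negotiation might be realized as follows, under the assumption that the sender simply caps each dimension at the receiver's reported display resolution; the patent says only that the devices generate streams of "appropriate resolutions."

```python
def target_resolution(source, display):
    # Never upscale: cap each dimension at the receiver's display so the
    # transcoded stream preserves quality while reducing bandwidth usage.
    return (min(source[0], display[0]), min(source[1], display[1]))

# The sender captures 1080p but the receiver reports a 720p television:
assert target_resolution((1920, 1080), (1280, 720)) == (1280, 720)
# A receiver with a larger display simply gets the full source resolution:
assert target_resolution((1920, 1080), (3840, 2160)) == (1920, 1080)
```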
- FIG. 5 shows a process 500 for the video calling system 100 of FIG. 1 according to an embodiment where a first video call device 102 A transcodes video and shared local media content for a second video call device 102 B during a video call.
- although the elements of process 500 are presented in one arrangement, other embodiments may feature other arrangements.
- some or all of the elements of process 500 can be executed in a different order, concurrently, and the like.
- some elements of process 500 may not be performed, and may not be executed immediately after each other.
- only two video call devices 102 A,B are shown in FIG. 5 . However, it should be understood that more than two video call devices 102 can participate in process 500 .
- the second video call device 102 B sends a link partner media quality request to the first video call device 102 A.
- the link partner media quality request indicates a desired quality for video call video and the media content.
- the link partner media quality request can indicate the resolution of the display device connected to the second video call device 102 B.
- the video call devices 102 A,B then conduct a video call.
- the first video call device 102 A receives first audio information and first video information AV 1 at 504 , for example from a local camera 232 and microphone 234 .
- the second video call device 102 B receives second audio information and second video information AV 2 at 506 .
- the video call devices 102 A,B exchange the first and second audio and video information AV 1 and AV 2 at 510 .
- the first video call device 102 A renders the second audio and video information AV 2 at 512 , for example on television set 106 A.
- the second video call device 102 B renders the first audio and video information AV 1 at 514 , for example on television set 106 B.
- the first video call device 102 A transcodes the first video information according to the desired quality at 508 prior to sending the first video information to the second video call device 102 B.
- CODEC 218 of the first video call device 102 A transcodes the first video information. This exchange can continue for the remainder of process 500 .
- the first video call device 102 A receives local media content.
- the first video call device 102 A can receive a photo stored on an SD card inserted in SD interface 252 .
- the first video call device 102 A transcodes the media content according to the desired quality prior to sending the media content to the second video call device 102 B.
- CODEC 218 of the first video call device 102 A transcodes the media content.
- the first video call device 102 A sends the transcoded media content to the second video call device 102 B. While the media content is being transferred, the displays at both ends of the video call show the status of the transfer.
- the video call devices 102 exchange one or more synchronization commands over the command channel to indicate completion of the transfer.
- the video call devices 102 render the media content at the same time.
- the video call devices 102 render a photo simultaneously.
- the video call devices 102 begin playback of a video file simultaneously.
- video call device 102 A renders the media content at 524, and video call device 102 B renders the transcoded media content at 526.
- a two-way exchange of synchronization commands is shown.
- a single synchronization command can be sent.
- the second video call device 102 B can send a single synchronization command to the first video call device 102 A when the transfer of the transcoded media content to the second video call device 102 B is complete, is sufficiently complete to begin playback, or the like.
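The "sufficiently complete to begin playback" test could be realized as a simple rate comparison: playback can safely start once the remainder of the transfer is guaranteed to arrive before playback finishes. The rates below are assumed inputs for illustration, not values from the disclosure.

```python
def can_start_playback(total_bytes, received_bytes, download_bps, playback_bps):
    # Time to finish downloading the rest of the file.
    remaining_download_s = (total_bytes - received_bytes) / download_bps
    # Time playback takes to consume the whole file.
    playback_s = total_bytes / playback_bps
    # Safe to start if the rest of the file downloads before playback ends.
    return remaining_download_s <= playback_s

# 10 MB clip, 4 MB received; downloading at 1 MB/s, playing at 0.5 MB/s:
assert can_start_playback(10e6, 4e6, 1e6, 0.5e6)
# The same clip over a 0.2 MB/s link is not yet sufficiently complete:
assert not can_start_playback(10e6, 4e6, 0.2e6, 0.5e6)
```

When the condition becomes true, the receiving device can issue its single synchronization command rather than waiting for the transfer to complete.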
- Embodiments of the disclosure can be implemented in digital electronic circuitry, or in computer hardware, firmware, software, or in combinations of them.
- Embodiments of the disclosure can be implemented in a computer program product tangibly embodied in a machine-readable storage device for execution by a programmable processor; and method steps of the disclosure can be performed by a programmable processor executing a program of instructions to perform functions of the disclosure by operating on input data and generating output.
- the disclosure can be implemented advantageously in one or more computer programs that are executable on a programmable system including at least one programmable processor coupled to receive data and instructions from, and to transmit data and instructions to, a data storage system, at least one input device, and at least one output device.
- Each computer program can be implemented in a high-level procedural or object-oriented programming language, or in assembly or machine language if desired; and in any case, the language can be a compiled or interpreted language.
- Suitable processors include, by way of example, both general and special purpose microprocessors.
- a processor will receive instructions and data from a read-only memory and/or a random access memory.
- a computer will include one or more mass storage devices for storing data files; such devices include magnetic disks, such as internal hard disks and removable disks; magneto-optical disks; and optical disks.
- Storage devices suitable for tangibly embodying computer program instructions and data include all forms of non-volatile memory, including by way of example semiconductor memory devices, such as EPROM, EEPROM, and flash memory devices; magnetic disks such as internal hard disks and removable disks; magneto-optical disks; and CD-ROM disks. Any of the foregoing can be supplemented by, or incorporated in, ASICs (application-specific integrated circuits).
Abstract
Video call devices having corresponding methods and non-transitory computer-readable media comprise: a video input interface configured to receive first video information; an audio input interface configured to receive first audio information; a transmitter configured to transmit first signals during a video call, wherein the first signals represent the first video information and the first audio information; a receiver configured to receive second signals during the video call, wherein the second signals represent second video information and second audio information; a video output interface configured to provide the second video information; an audio output interface configured to provide the second audio information; wherein the transmitter is further configured to transmit third signals during the video call, wherein the third signals represent at least one of media content, and a hyperlink, wherein the hyperlink indicates a location of the media content.
Description
- This application claims benefit of U.S. Provisional Patent Application Ser. No. 61/485,229 entitled “MEDIA SHARING DURING A VIDEO CALL,” filed May 12, 2011, the disclosure thereof incorporated by reference herein in its entirety.
- This application claims benefit of U.S. Provisional Patent Application Ser. No. 61/485,233 entitled “WIRELESS NETWORK DEVICE CONFIGURATION USING TWO-DIMENSIONAL PATTERNS,” filed May 12, 2011, the disclosure thereof incorporated by reference herein in its entirety.
- This application claims benefit of U.S. Provisional Patent Application Ser. No. 61/485,237 entitled “SMART REMOTE CONTROL DEVICES FOR VIDEO CALLING,” filed May 12, 2011, the disclosure thereof incorporated by reference herein in its entirety.
- This application is related to U.S. Patent Application Serial No. (to be assigned, Attorney Docket No. TLY003001), entitled “WIRELESS NETWORK DEVICE CONFIGURATION USING TWO-DIMENSIONAL PATTERNS,” filed TBD, the disclosure thereof incorporated by reference herein in its entirety.
- This application is related to U.S. Patent Application Serial No. (to be assigned, Attorney Docket No. TLY004001), entitled “SMART REMOTE CONTROL DEVICES FOR CONTROLLING VIDEO CALL DEVICES,” filed TBD, the disclosure thereof incorporated by reference herein in its entirety.
- The present disclosure relates generally to video calling. More particularly, the present disclosure relates to sharing media during a video call.
- The traditional use of television has been for passive consumption of content. The content is mostly television programming (live as well as on-demand) and outputs of other local devices such as media players (for example, DVD, CD, and VCR devices), video game devices, and the like. But despite the availability of large, high-resolution television screens, few solutions allow the use of these television screens for video calling. One solution involves connecting a computer to a webcam, speakers, microphone, and the television screen, installing and executing video calling software on the computer, and controlling the computer using a keyboard and mouse. Users generally avoid such tedious tasks.
- Similar problems plague interactive media sharing such as sharing photos online. In order to share photos, most people prepare a web album, and email links to those albums to other parties. However, this technique does not allow the involved parties to step through the album in a synchronized manner while sharing verbal comments and the like. In addition, the steps of uploading photo albums and emailing links to the albums generally require a web browser or other dedicated software and a computer.
- Other solutions involve “pure” screen sharing. According to these solutions, one user's screen is compressed and transported to other participants. Such screen sharing often suffers from video compression artifacts, especially if the available data rate for communication fluctuates. Furthermore, such solutions fail to deliver media to all participants in their native, pristine format. These solutions also fail to provide synchronized playback such that all users have an almost identical contemporaneous experience while playing the media.
- In general, in one aspect, an embodiment features a video call device comprising: a video input interface configured to receive first video information; an audio input interface configured to receive first audio information; a transmitter configured to transmit first signals during a video call, wherein the first signals represent the first video information and the first audio information; a receiver configured to receive second signals during the video call, wherein the second signals represent second video information and second audio information; a video output interface configured to provide the second video information; an audio output interface configured to provide the second audio information; wherein the transmitter is further configured to transmit third signals during the video call, wherein the third signals represent at least one of media content, and a hyperlink, wherein the hyperlink indicates a location of the media content.
- Embodiments of the video call device can include one or more of the following features. Some embodiments comprise an encoder/decoder (CODEC); wherein the receiver is further configured to receive fourth signals, wherein the fourth signals represent a link partner media quality request, wherein the link partner media quality request indicates a desired quality for the first video information; wherein the CODEC is configured to transcode the first video information according to the desired quality prior to the first signals being transmitted by the transmitter. Some embodiments comprise a media interface configured to receive the media content; wherein the third signals represent the media content. Some embodiments comprise an encoder/decoder (CODEC); wherein the receiver is further configured to receive fourth signals, wherein the fourth signals represent a link partner media quality request, wherein the link partner media quality request indicates a desired quality for the media content; wherein the CODEC is configured to transcode the media content according to the desired quality prior to the third signals being transmitted by the transmitter. In some embodiments, the media interface comprises at least one of: an SD card interface; a USB interface; and a mass storage interface. Some embodiments comprise a processor configured to generate one or more first playback synchronization commands, wherein the first playback synchronization commands include timing information for playback of the media content; wherein the transmitter is further configured to transmit fourth signals during the video call, wherein the fourth signals represent the one or more first playback synchronization commands. 
In some embodiments, the receiver is further configured to receive fifth signals during the video call, wherein the fifth signals represent one or more second playback synchronization commands; and the processor is further configured to control playback of the media content according to the one or more second playback synchronization commands. In some embodiments, the playback synchronization commands represent at least one of: a file transfer status for the media content; a playback position for the media content; and a time of a modification of the media content by a user. Some embodiments comprise one or more cameras configured to provide the first video information to the video input interface; and one or more microphones configured to provide the first audio information to the audio input interface.
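The claimed playback synchronization commands represent at least one of a file transfer status, a playback position, and the time of a user's modification of the media content. A minimal sketch of such a command as a record follows; the field names and encoding are hypothetical, not taken from the disclosure.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class PlaybackSyncCommand:
    transfer_status: Optional[str] = None       # e.g. "in_progress", "complete"
    playback_position_ms: Optional[int] = None  # where playback should be
    modified_at_ms: Optional[int] = None        # when a user altered the media

    def encode(self):
        # Only the populated fields travel over the command channel.
        return {k: v for k, v in self.__dict__.items() if v is not None}

cmd = PlaybackSyncCommand(transfer_status="complete", playback_position_ms=0)
assert cmd.encode() == {"transfer_status": "complete", "playback_position_ms": 0}
```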
- In general, in one aspect, an embodiment features a method comprising: receiving first video information; receiving first audio information; transmitting first signals during a video call, wherein the first signals represent the first video information and the first audio information; receiving second signals during the video call, wherein the second signals represent second video information and second audio information; providing the second video information; providing the second audio information; and transmitting third signals during the video call, wherein the third signals represent at least one of media content, and a hyperlink, wherein the hyperlink indicates a location of the media content.
- Embodiments of the method can include one or more of the following features. Some embodiments comprise receiving fourth signals, wherein the fourth signals represent a link partner media quality request, wherein the link partner media quality request indicates a desired quality for the first video information; and transcoding the first video information according to the desired quality prior to transmitting the first signals. Some embodiments comprise receiving the media content; wherein the third signals represent the media content. Some embodiments comprise receiving fourth signals, wherein the fourth signals represent a link partner media quality request, wherein the link partner media quality request indicates a desired quality for the media content; and transcoding the media content according to the desired quality prior to transmitting the third signals. Some embodiments comprise generating one or more first playback synchronization commands, wherein the first playback synchronization commands include timing information for playback of the media content; and transmitting fourth signals during the video call, wherein the fourth signals represent the one or more first playback synchronization commands. Some embodiments comprise receiving fifth signals during the video call, wherein the fifth signals represent one or more second playback synchronization commands; and controlling playback of the media content according to the one or more second playback synchronization commands. In some embodiments, the playback synchronization commands represent at least one of: a file transfer status for the media content; a playback position for the media content; and a time of a modification of the media content by a user.
- In general, in one aspect, an embodiment features non-transitory computer-readable media embodying instructions executable by a computer to perform functions comprising: receiving first video information and first audio information; causing transmission of first signals during a video call, wherein the first signals represent the first video information and the first audio information; providing second video information and second audio information based on second signals received during the video call, wherein the second signals represent the second video information and the second audio information; causing transmission of third signals during the video call, wherein the third signals represent at least one of media content, and a hyperlink, wherein the hyperlink indicates a location of the media content.
- Embodiments of the non-transitory computer-readable media can include one or more of the following features. In some embodiments, the functions further comprise: receiving a link partner media quality request, wherein the link partner media quality request indicates a desired quality for the first video information; and transcoding the first video information according to the desired quality prior to causing transmission of the first signals. In some embodiments, the functions further comprise: receiving the media content; wherein the third signals represent the media content. In some embodiments, the functions further comprise: receiving a link partner media quality request, wherein the link partner media quality request indicates a desired quality for the media content; and transcoding the media content according to the desired quality prior to causing transmission of the third signals. In some embodiments, the functions further comprise: generating one or more first playback synchronization commands, wherein the first playback synchronization commands include timing information for playback of the media content; and causing transmission of fourth signals during the video call, wherein the fourth signals represent the one or more first playback synchronization commands. In some embodiments, the functions further comprise: receiving one or more second playback synchronization commands; and controlling playback of the media content according to the one or more second playback synchronization commands. In some embodiments, the playback synchronization commands represent at least one of: a file transfer status for the media content; a playback position for the media content; and a time of a modification of the media content by a user.
- The details of one or more implementations are set forth in the accompanying drawings and the description below. Other features will be apparent from the description and drawings, and from the claims.
- FIG. 1 shows elements of a video calling system according to one embodiment.
- FIG. 2 shows elements of a video calling device of FIG. 1 according to one embodiment.
- FIG. 3 shows a process for the video calling system of FIG. 1 according to an embodiment where a first video call device shares local media content with a second video call device during a video call.
- FIG. 4 shows a process for the video calling system of FIG. 1 according to an embodiment where two video call devices share media content stored at a remote location during a video call.
- FIG. 5 shows a process for the video calling system of FIG. 1 according to an embodiment where a first video call device transcodes video and shared local media content for a second video call device during a video call.
- The leading digit(s) of each reference numeral used in this specification indicates the number of the drawing in which the reference numeral first appears.
- The described embodiments provide media sharing during a video call while not requiring a personal computer. The video calls and media sharing can be point-to-point or multi-point. These calls are not limited to video calls, and can include voice-only or video-only calls as well. Embodiments ensure that the shared media are rendered (that is, played back to the participants) such that the playback timing, as well as the apparent quality of the media, is nearly identical for all participants. Before describing these aspects, an example video call device is described.
- Video Call Device
- FIG. 1 shows elements of a video calling system 100 according to one embodiment. Although in the described embodiments the elements of the video calling system 100 are presented in one arrangement, other embodiments may feature other arrangements. For example, elements of the video calling system 100 can be implemented in hardware, software, or combinations thereof.
- Referring to FIG. 1, the video calling system 100 includes N video call devices 102A and 102B through 102N connected by a network 108. Network 108 can be implemented as a wide-area network such as the Internet, a local-area network (LAN), or the like. While various embodiments are described with respect to network communications, they also apply to devices employing other forms of data communications such as direct links and the like.
- In the embodiment of FIG. 1, the video call devices 102 do not include display screens or speakers. Therefore each video call device 102 is connected to a respective television set (TV) 106A and 106B through 106N. In other embodiments, one or more of the video call devices 102 includes a display screen and speakers, so one or more television sets 106 are not required. In FIG. 1, each video call device 102 is controlled by one or more respective users, for example using one or more respective remote controls (RC) 110.
- FIG. 2 shows elements of a video call device 102 of FIG. 1 according to one embodiment. Although in the described embodiments the elements of video call device 102 are presented in one arrangement, other embodiments may feature other arrangements. For example, elements of video call device 102 can be implemented in hardware, software, or combinations thereof.
- Referring to FIG. 2, the video call device 102 includes an audio-visual (AV) interface (I/F) 202, a network adapter 204, a media interface 206, and a remote control (RC) interface 208. The video call device 102 also includes a processor or central processing unit (CPU) 210, a graphical processing unit (GPU) 212, a memory 214, a coder/decoder (CODEC) 218, a multiplexer (MUX) 220, and a clock 222.
- The AV interface 202 includes a video input interface (Video In) 224, an audio input interface (Audio In) 226, a video output interface (Video Out) 228, and an audio output interface (Audio Out) 230. The video input interface 224 can be connected to one or more video capture devices such as a camera 232 or the like. Camera 232 can be implemented as a wide-angle camera that sees the whole room. The audio input interface 226 can be connected to one or more audio capture devices such as a microphone 234 or the like. Microphone 234 can be implemented as a noise-cancelling microphone. In some embodiments, video call device 102 includes one or more cameras 232 and/or one or more microphones 234. For example, multiple cameras 232 can be included to generate three-dimensional (3D) video. As another example, multiple microphones 234 can be included so that beamforming techniques can be used to isolate conversations from background noise.
- The video output interface 228 can be connected to a display screen such as that of a television set 106. The audio output interface 230 can be connected to one or more speakers such as those of a television set 106. Alternatively, the video output interface 228 and/or the audio output interface 230 can be connected to the audio-visual inputs of a home theater system or the like. The video output interface 228 and the audio output interface 230 can employ any appropriate connection, such as Digital Visual Interface (DVI), High-Definition Multimedia Interface (HDMI), and the like.
- The network adapter 204 includes a wireless network adapter 236 and a wired network adapter 238. In some embodiments, network adapter 204 includes additional communication interfaces, for example Bluetooth communication interfaces and the like.
- The wireless network adapter 236 includes a transmitter (TX) 240 to transmit wireless signals and a receiver (RX) 242 to receive wireless signals, and is connected to one or more antennas 244. In some embodiments, wireless network adapter 236 is compliant with all or part of IEEE standard 802.11, including draft and approved amendments such as 802.11-1997, 802.11a, 802.11b, 802.11g, 802.11-2007, 802.11n, 802.11-2012, and 802.11ac. For example, the wireless network adapter 236 can allow Wi-Fi connections, for example to a router, to other Wi-Fi devices such as smartphones and computers, and the like.
- The wired network adapter 238 includes a transmitter (TX) 246 to transmit wired signals and a receiver (RX) 248 to receive wired signals, and is connected to a wired network interface 250. In some embodiments, wired network adapter 238 is compliant with all or part of IEEE standard 802.3, including draft and approved amendments.
- The disclosed video call devices 102 are capable of peer-to-peer (P2P) audio/video communication. Using P2P technology, two video call devices 102 can be connected to each other by one or more networks such that data packets can flow between them. The video call devices 102 can be located anywhere in the world, so long as they are connected by networks 108 such as the Internet. The video call devices 102 can employ multiple communication channels between participants. One channel carries the primary video stream of the video call. Another channel carries the primary audio stream of the video call. A command channel carries commands such as camera commands (for example, pan, tilt, and zoom) and the like. The command channel can also carry synchronization commands to ensure synchronized media playback across multiple sites. Additional channels can be employed for other tasks such as media sharing and the like.
- Some available P2P technologies provide multiple communication channels for each video call device 102. The video call device 102 can employ the provided channels and/or channels established outside the chosen P2P technology. P2P technologies generally provide network address translation (NAT) traversal for their channels. The video call devices 102 described herein can provide NAT traversal for channels established outside the chosen P2P technology.
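The multi-channel arrangement described above (primary video, primary audio, command, and media-sharing channels) can be sketched with a toy framing scheme; the 1-byte channel IDs and frame layout are assumptions for illustration only, since the disclosure does not define a framing format.

```python
CHANNELS = {"video": 0, "audio": 1, "command": 2, "media": 3}

def frame(channel_name, payload: bytes) -> bytes:
    # Prefix each payload with a 1-byte channel ID so a single P2P connection
    # can carry all four logical channels.
    return bytes([CHANNELS[channel_name]]) + payload

def deframe(data: bytes):
    names = {v: k for k, v in CHANNELS.items()}
    return names[data[0]], data[1:]

# A pan command travels on the command channel alongside the A/V streams:
assert deframe(frame("command", b"pan:+10")) == ("command", b"pan:+10")
```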
media interface 206 receives local media content from external sources, and provides that media content to one or both ofprocessors FIG. 2 , themedia interface 206 includes a Secure Digital (SD)interface 252, a Universal Serial Bus (USB)interface 254, and amass storage interface 216. Other embodiments can include other interfaces. - The
SD interface 252 receives SD cards, and provides media content stored thereon to the CPU 210 and the GPU 212. The USB interface 254 receives USB devices such as USB memory sticks, USB-cabled devices, and the like, and provides media content from those devices to the CPU 210 and the GPU 212. The USB interface 254 can also receive input devices such as USB dongles for wireless keyboards, wireless pointing devices, and the like. The mass storage interface 216 allows for connection to mass storage devices such as external solid-state drives, disk drives, and the like, and provides media content stored thereon to the CPU 210 and the GPU 212. - The remote control (RC)
interface 208 receives wireless signals such as infrared signals from remote control devices for controlling the video call device 102. In some embodiments, the video call device 102 can be controlled by a wireless device via the wireless network adapter 236. - The
CPU 210 handles general processing functions, while the GPU 212 handles graphic processing functions. In some embodiments, the CPU 210 handles graphic processing functions as well, so the GPU 212 is not required. The CPU 210 receives a time base from clock 222. The memory 214 can be implemented as semiconductor memory and the like. - The
CODEC 218 provides encoding, decoding, and transcoding of the audio and video data handled by the video call device 102. In some embodiments, the CODEC 218 is compliant with one or more standards such as the H.264 standard and the like. - The
MUX 220 allows audio and video to be exchanged via the A/V interface 202, a virtual interface 256, or both. The MUX 220 allows any of the inputs and outputs to be switched with virtual inputs and outputs. For example, audio and video can be provided to and/or from other local devices such as smartphones, portable cameras, document cameras, computer displays of external computers, and the like. - Media Sharing
- In some embodiments, the described
video call devices 102 provide for sharing of arbitrary media content during the video call. In some cases the media content is provided by a video call device 102. In other cases, the media content is stored at a remote location, and a video call device 102 provides a hyperlink that indicates that location. The hyperlink can include a uniform resource locator (URL), Internet protocol (IP) address, or the like. Any type of media content can be shared. Examples of media content that can be shared include photos; documents in various formats; document snapshots; screen snapshots; video files and streams; audio files and streams; and the like. Video call participants can provide audio and/or video commentary during the video call while sharing the media content. - The described
video call devices 102 allow users to share photos during a video call. For example, a user can prepare a playlist of photos during or before a video call. During the video call, the user can manually step through the playlist, thereby deciding the sequence and pace of sharing the photos in real time. Alternatively, the user can prepare the playlist with the desired sequence and share the photos such that the sequence of photos advances automatically. The photos can be transferred as files during the video call. - The described
video call devices 102 allow users to share documents during a video call. For example, a user can choose documents in various formats for sharing. The formats can include text and binary formats, from simple text-only documents to documents that include text and graphics. The formats can include portable formats, webpage formats, and the like. The documents can be transferred as files during the video call. - The described
video call devices 102 allow users to share document snapshots during a video call. The documents can include the documents mentioned above, webpages, and the like. A snapshot can be an image representing a document. The image can be in any format. For example, in the case of a webpage, the snapshot can be a bit-map recording of the rendered webpage content. The user can use the web browser during or before the call to take snapshots of web content to share in the call. The document snapshots can be transferred as files during the video call. - The described
video call devices 102 allow users to share video files and streams during a video call. The video files and streams can be recorded using camera 232. Alternatively, the video files and streams can be imported into the video call devices 102 from other sources, such as memory cards, other physical devices, networks such as the Internet, and the like. The video files and streams can be transferred as files or streamed during the video call. - The described
video call devices 102 allow users to share audio files and streams during a video call. The audio files and streams can be recorded using microphone 234. Alternatively, the audio files and streams can be imported into the video call devices 102 from other sources, such as memory cards, other physical devices, networks such as the Internet, and the like. The audio files can be transferred as files or streamed during the video call. - The described
video call devices 102 allow users to share applications and screens during a video call. For example, during execution of an application a live computer screen can be shared as a video or one or more snapshots. In many cases, sharing an application is more useful than sharing documents produced by the application. One advantage is that the other users need not execute, or even possess, the application. Another advantage is that the document views produced by the application are the same for all users. For example, it is more useful to share a view of a small portion of a spreadsheet than to share the spreadsheet file, have all users execute the spreadsheet application, and then have all users navigate to the same place in the spreadsheet. As another example, it is simpler to share views of a secure webpage than to have all of the users log on to the web site and find the same view of the same webpage. - The described
video call devices 102 allow users to share playlists of media content. The playlist can include only one type of media, or multiple types of media. The media content can be shared consecutively, and in some cases, simultaneously. For example, a playlist of photos can be accompanied by music. The playlist can include media files, media streams, and hyperlinks to media content. URLs to photo playlists can also be shared, such that photos are fetched and rendered in a synchronized manner for all participants during the call. The order of stepping through the media can be chosen by any participant during the call. - The described
video call devices 102 allow users to share media content using hyperlinks for the media content during a video call. For example, the hyperlink can be the URL of a webpage, which is then both fetched and rendered in a synchronized manner for all participants during the video call. URLs to playlists can also be shared, such that playlist media content are fetched and rendered in a synchronized manner for all participants during the call. The order of stepping through the media content can be chosen by any participant during the call. The hyperlinks can be transferred as files during the video call. - The media content shared during the video call can originate from many sources. For example, a participant can insert an SD card storing a media file into the
SD interface 252 of a video call device 102. A participant can obtain a media file from an external device such as a smartphone, computer, or the like using the USB interface 254 of a video call device 102, the wired network adapter 238 or wireless network adapter 236 of a video call device 102, or the like. A participant can record a media file using the camera 232 of a video call device 102. A participant can download a media file from the Internet into a video call device 102. A participant can generate snapshots of web pages while browsing the web on a video call device 102. A participant can share a URL for media files stored in a network such as the Internet. Of course, other sources can be used. - Synchronized Media Playback
- In some embodiments, the described
video call devices 102 also provide synchronized media playback such that all participants of a video call have an almost identical contemporaneous experience while playing the media content. During a video call, media files can be downloaded or streamed from the Internet to video call devices 102 or transmitted from one video call device 102 to another with the goal of playing them back in a synchronized manner. The video call devices 102 employ synchronization algorithms to keep track of what data are available at what video call devices 102 at a certain time. To minimize latencies, some files are transmitted (or fetched) ahead of time. Data are pre-fetched (or pre-transmitted) in anticipation that those data will be required as the playback advances. The synchronization algorithms also consider how much network throughput is available, while not adversely affecting the audio/video packet flow. - Synchronization scenarios are described below. In sharing files such as photos, synchronization involves determining when all participants have obtained the file. In sharing streams and video files, synchronization involves matching playback positions in the streams.
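The bookkeeping described above — tracking which device has which data and pre-fetching items just ahead of the playback position — can be sketched as follows. This is an illustrative sketch only; the class and method names are assumptions, not an implementation disclosed by the patent.

```python
class AvailabilityTracker:
    """Track which media items each call device has received.

    Playback of an item is 'ready' only once every participant has it,
    and items within a small window beyond the playback position are
    candidates for pre-fetching (or pre-transmitting)."""

    def __init__(self, devices, playlist):
        self.devices = list(devices)
        self.playlist = list(playlist)
        self.have = {d: set() for d in self.devices}

    def mark_received(self, device, item):
        # Record that `device` now holds `item`.
        self.have[device].add(item)

    def ready(self, item):
        # An item can be rendered in sync only when all devices have it.
        return all(item in self.have[d] for d in self.devices)

    def prefetch_candidates(self, position, lookahead=2):
        # Items within `lookahead` of the playback position that some
        # device still lacks, in anticipation of upcoming playback.
        window = self.playlist[position:position + lookahead]
        return [item for item in window if not self.ready(item)]


tracker = AvailabilityTracker(["102A", "102B"], ["p1.jpg", "p2.jpg", "p3.jpg"])
tracker.mark_received("102A", "p1.jpg")
tracker.mark_received("102B", "p1.jpg")
```

A real implementation would also weigh available network throughput so pre-fetching does not disturb the audio/video packet flow, as the text notes.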
- The media can originate not only from video call participants' end-points, but also from remote sites such as Internet servers. In the latter case, each participant's
video call device 102 fetches the media using the same URLs in a synchronized manner such that the media can be played back in a synchronized manner. - In some embodiments, other events can be synchronized during a video call. For example, when a video call participant modifies the shared media content, the modification can be synchronized so that it is rendered simultaneously for all video call participants. Example modifications can include rotating an image after it is shared, changing the cursor position in a video (jogging), zooming in on an image or video, marking up the shared media content, and the like.
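One way to propagate such modifications is to encode each one as a small command-channel event that every device replays identically. The JSON event shape below is an assumption for illustration; the patent does not specify a wire format.

```python
import json


def make_modification_event(media_id, action, **params):
    """Encode a shared-media modification (e.g. rotate, zoom, jog,
    markup) for transmission over the command channel.
    The message shape is illustrative only."""
    return json.dumps({"media_id": media_id, "action": action, "params": params})


def apply_modification(state, event):
    """Replay a modification event against a device's local view of the
    shared media, so every participant renders the same result."""
    msg = json.loads(event)
    if msg["action"] == "rotate":
        state["rotation"] = (state.get("rotation", 0) + msg["params"]["degrees"]) % 360
    elif msg["action"] == "zoom":
        state["zoom"] = msg["params"]["factor"]
    return state


rotate_90 = make_modification_event("photo-1", "rotate", degrees=90)
```

Because each device applies the same event to the same starting state, rotating or zooming on one end is rendered identically for all video call participants.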
-
FIG. 3 shows a process 300 for the video calling system 100 of FIG. 1 according to an embodiment where a first video call device 102A shares local media content with a second video call device 102B during a video call. Although in the described embodiments the elements of process 300 are presented in one arrangement, other embodiments may feature other arrangements. For example, in various embodiments, some or all of the elements of process 300 can be executed in a different order, concurrently, and the like. Also, some elements of process 300 may not be performed, and may not be executed immediately after each other. For clarity, only two video call devices 102A,B are shown in FIG. 3. However, it should be understood that more than two video call devices 102 can participate in process 300. - Referring to
FIG. 3, the video call devices 102A,B conduct a video call. In particular, the first video call device 102A receives first audio information and first video information AV1 at 302, for example from a local camera 232 and microphone 234. The second video call device 102B receives second audio information and second video information AV2 at 304. The video call devices 102A,B exchange the first and second audio and video information AV1 and AV2 at 306. The first video call device 102A renders the second audio and video information AV2 at 308, for example on television set 106A. The second video call device 102B renders the first audio and video information AV1 at 310, for example on television set 106B. This exchange can continue for the remainder of process 300. - At 312, the first
video call device 102A receives local media content. For example, the first video call device 102A can receive a photo stored on an SD card inserted in SD interface 252. At 314, the first video call device 102A sends the media content to the second video call device 102B. While the media content is being transferred, the displays at both ends of the video call show the status of the transfer. - At 316, when the media content is available at
video call device 102B, the video call devices 102 exchange one or more synchronization commands over the command channel to indicate completion of the transfer. For example, at one video call device 102, the processor 210 generates one or more synchronization commands, and either transmitter 240 or transmitter 246 transmits signals representing the one or more synchronization commands. At the other video call device 102, either receiver 242 or receiver 248 receives the signals representing the one or more synchronization commands, and processor 210 controls the playback of the media content according to the one or more synchronization commands. In response to the commands, the video call devices 102 render the media content at the same time. For example, the video call devices 102 render a photo simultaneously. As another example, the video call devices 102 begin playback of a video file simultaneously. In particular, video call device 102A renders the media content at 318, and video call device 102B renders the media content at 320. - In
FIG. 3, a two-way exchange of synchronization commands is shown. In other embodiments, a single synchronization command can be sent. For example, the second video call device 102B can send a single synchronization command to the first video call device 102A when the transfer of the media content to the second video call device 102B is complete, is sufficiently complete to begin playback, or the like. For example, rendering of a video can begin when a sufficient amount of the video has been transferred rather than waiting for the transfer to complete. As another example, when the media content includes multiple photos, rendering of one photo can begin while subsequent photos are being transferred. -
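The "sufficiently complete to begin playback" condition can be estimated with a simple heuristic: start rendering once the buffered data will play for longer than the transfer needs to finish. This heuristic is an assumption for illustration; the patent does not specify the criterion.

```python
def can_begin_playback(bytes_received, total_bytes,
                       bitrate_bits_per_s, transfer_bytes_per_s):
    """Return True when enough of a video file has arrived to begin
    rendering before the transfer completes.

    Heuristic (assumed, not from the patent): playback can start when
    the playback duration of the buffered data is at least the time
    remaining to finish the transfer at the current rate."""
    seconds_to_finish = (total_bytes - bytes_received) / transfer_bytes_per_s
    buffered_seconds = bytes_received * 8 / bitrate_bits_per_s
    return buffered_seconds >= seconds_to_finish
```

For example, for a 10 MB video encoded at 4 Mb/s arriving at 1 MB/s, the function returns True once 6 MB has been received (12 s of buffered playback versus 4 s of remaining transfer), so the device could send its single synchronization command at that point.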
FIG. 4 shows a process 400 for the video calling system 100 of FIG. 1 according to an embodiment where two video call devices 102A,B share media content stored at a remote location during a video call. Although in the described embodiments the elements of process 400 are presented in one arrangement, other embodiments may feature other arrangements. For example, in various embodiments, some or all of the elements of process 400 can be executed in a different order, concurrently, and the like. Also, some elements of process 400 may not be performed, and may not be executed immediately after each other. For clarity, only two video call devices 102A,B are shown in FIG. 4. However, it should be understood that more than two video call devices 102 can participate in process 400. - Referring to
FIG. 4, the video call devices 102A,B conduct a video call. In particular, the first video call device 102A receives first audio information and first video information AV1 at 402, for example from a local camera 232 and microphone 234. The second video call device 102B receives second audio information and second video information AV2 at 404. The video call devices 102A,B exchange the first and second audio and video information AV1 and AV2 at 406. The first video call device 102A renders the second audio and video information AV2 at 408, for example on television set 106A. The second video call device 102B renders the first audio and video information AV1 at 410, for example on television set 106B. This exchange can continue for the remainder of process 400. - At 412, the first
video call device 102A receives a hyperlink such as a URL. For example, a user of the first video call device 102A can input the hyperlink using a remote control or wireless keyboard. At 414, the first video call device 102A sends the hyperlink to the second video call device 102B. Each video call device 102 uses the hyperlink to independently get the media content from a network server 430 indicated by the hyperlink. In particular, at 416 the first video call device 102A uses the hyperlink to get the media content, and at 418 the second video call device 102B uses the hyperlink to get the media content. - At 420, when the media content is available at both
video call devices 102A,B, the video call devices 102 exchange one or more synchronization commands over the command channel to indicate completion of the transfer. In response to the commands, the video call devices 102 render the media content at the same time. For example, the video call devices 102 render a photo simultaneously. As another example, the video call devices 102 begin playback of a video file simultaneously. In particular, video call device 102A renders the media content at 422, and video call device 102B renders the media content at 424. - Media Quality Preservation
- In some embodiments, the described
video call devices 102 preserve the apparent quality of media shared during a video call. One illustrative example is the case of photo sharing. In some current photo sharing approaches, a photo slideshow is treated as a video. That is, the photos are compressed using a video compression technique and streamed to other participants. However, this approach introduces undesirable video compression artifacts. For example, each photo appears blurry at first, and then is enhanced over time. As another example, hardware video encoders are often forced to insert a key frame (also called an intra frame, intra-coded frame, or I-frame) at regular intervals. In the case of a photo, this is equivalent to communicating the photo again from scratch, which can cause the photo's appearance to cycle repeatedly from blurry to crisp and back to blurry. In contrast, described embodiments send the photo files to all participants, so the photos appear the same to all participants, including the sender. - The described
video call devices 102 also consider the resolution of the display devices employed by video call participants, both for the primary video stream and for media sharing. During setup of a video call, each video call device 102 informs the other video call devices 102 of its display device resolution. During the video call, the video call devices 102 employ transcoding to generate video streams of appropriate resolutions, thereby preserving video quality while reducing bandwidth usage. -
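A policy for choosing per-partner output resolutions from the exchanged display information might look like the sketch below. The rule — cap each dimension at both the source resolution and the partner's display resolution — is an assumption for illustration, not a policy the patent specifies; aspect-ratio handling is omitted for brevity.

```python
def transcode_targets(source_resolution, partner_displays):
    """Choose a transcode output resolution for each link partner.

    Policy (assumed, not from the patent): never exceed the source
    resolution, and never send more pixels than a partner's display
    can show, preserving quality while reducing bandwidth."""
    src_w, src_h = source_resolution
    return {
        partner: (min(src_w, disp_w), min(src_h, disp_h))
        for partner, (disp_w, disp_h) in partner_displays.items()
    }


# A 1080p source sent to a 720p display and a 4K display: the 720p
# partner gets a downscaled stream, the 4K partner gets the source
# resolution (no upscaling).
targets = transcode_targets(
    (1920, 1080),
    {"102B": (1280, 720), "102C": (3840, 2160)},
)
```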
FIG. 5 shows a process 500 for the video calling system 100 of FIG. 1 according to an embodiment where a first video call device 102A transcodes video and shared local media content for a second video call device 102B during a video call. Although in the described embodiments the elements of process 500 are presented in one arrangement, other embodiments may feature other arrangements. For example, in various embodiments, some or all of the elements of process 500 can be executed in a different order, concurrently, and the like. Also, some elements of process 500 may not be performed, and may not be executed immediately after each other. For clarity, only two video call devices 102A,B are shown in FIG. 5. However, it should be understood that more than two video call devices 102 can participate in process 500. - Referring to
FIG. 5, at 502, the second video call device 102B sends a link partner media quality request to the first video call device 102A. The link partner media quality request indicates a desired quality for video call video and the media content. For example, the link partner media quality request can indicate the resolution of the display device connected to the second video call device 102B. - The
video call devices 102A,B then conduct a video call. In particular, the first video call device 102A receives first audio information and first video information AV1 at 504, for example from a local camera 232 and microphone 234. The second video call device 102B receives second audio information and second video information AV2 at 506. The video call devices 102A,B exchange the first and second audio and video information AV1 and AV2 at 510. The first video call device 102A renders the second audio and video information AV2 at 512, for example on television set 106A. The second video call device 102B renders the first audio and video information AV1 at 514, for example on television set 106B. However, the first video call device 102A transcodes the first video information according to the desired quality at 508 prior to sending the first video information to the second video call device 102B. In particular, CODEC 218 of the first video call device 102A transcodes the first video information. This exchange can continue for the remainder of process 500. - At 516, the first
video call device 102A receives local media content. For example, the first video call device 102A can receive a photo stored on an SD card inserted in SD interface 252. At 518, the first video call device 102A transcodes the media content according to the desired quality prior to sending the media content to the second video call device 102B. In particular, CODEC 218 of the first video call device 102A transcodes the media content. At 520, the first video call device 102A sends the transcoded media content to the second video call device 102B. While the media content is being transferred, the displays at both ends of the video call show the status of the transfer. - At 522, when the transcoded media content is available at
video call device 102B, the video call devices 102 exchange one or more synchronization commands over the command channel to indicate completion of the transfer. In response to the commands, the video call devices 102 render the media content at the same time. For example, the video call devices 102 render a photo simultaneously. As another example, the video call devices 102 begin playback of a video file simultaneously. In particular, video call device 102A renders the media content at 524, and video call device 102B renders the transcoded media content at 526. - In
FIG. 5, a two-way exchange of synchronization commands is shown. In other embodiments, a single synchronization command can be sent. For example, the second video call device 102B can send a single synchronization command to the first video call device 102A when the transfer of the transcoded media content to the second video call device 102B is complete, is sufficiently complete to begin playback, or the like. - Embodiments of the disclosure can be implemented in digital electronic circuitry, or in computer hardware, firmware, software, or in combinations of them. Embodiments of the disclosure can be implemented in a computer program product tangibly embodied in a machine-readable storage device for execution by a programmable processor; and method steps of the disclosure can be performed by a programmable processor executing a program of instructions to perform functions of the disclosure by operating on input data and generating output. The disclosure can be implemented advantageously in one or more computer programs that are executable on a programmable system including at least one programmable processor coupled to receive data and instructions from, and to transmit data and instructions to, a data storage system, at least one input device, and at least one output device. Each computer program can be implemented in a high-level procedural or object-oriented programming language, or in assembly or machine language if desired; and in any case, the language can be a compiled or interpreted language. Suitable processors include, by way of example, both general and special purpose microprocessors. Generally, a processor will receive instructions and data from a read-only memory and/or a random access memory. Generally, a computer will include one or more mass storage devices for storing data files; such devices include magnetic disks, such as internal hard disks and removable disks; magneto-optical disks; and optical disks.
Storage devices suitable for tangibly embodying computer program instructions and data include all forms of non-volatile memory, including by way of example semiconductor memory devices, such as EPROM, EEPROM, and flash memory devices; magnetic disks such as internal hard disks and removable disks; magneto-optical disks; and CD-ROM disks. Any of the foregoing can be supplemented by, or incorporated in, ASICs (application-specific integrated circuits).
- A number of implementations of the disclosure have been described. Nevertheless, it will be understood that various modifications may be made without departing from the spirit and scope of the disclosure. Accordingly, other implementations are within the scope of the following claims.
Claims (23)
1. A video call device comprising:
a video input interface configured to receive first video information;
an audio input interface configured to receive first audio information;
a transmitter configured to transmit first signals during a video call, wherein the first signals represent the first video information and the first audio information;
a receiver configured to receive second signals during the video call, wherein the second signals represent second video information and second audio information;
a video output interface configured to provide the second video information;
an audio output interface configured to provide the second audio information;
wherein the transmitter is further configured to transmit third signals during the video call, wherein the third signals represent at least one of
media content, and
a hyperlink, wherein the hyperlink indicates a location of the media content.
2. The video call device of claim 1 , further comprising:
an encoder/decoder (CODEC);
wherein the receiver is further configured to receive fourth signals, wherein the fourth signals represent a link partner media quality request, wherein the link partner media quality request indicates a desired quality for the first video information;
wherein the CODEC is configured to transcode the first video information according to the desired quality prior to the first signals being transmitted by the transmitter.
3. The video call device of claim 1 , further comprising:
a media interface configured to receive the media content;
wherein the third signals represent the media content.
4. The video call device of claim 3, further comprising:
an encoder/decoder (CODEC);
wherein the receiver is further configured to receive fourth signals, wherein the fourth signals represent a link partner media quality request, wherein the link partner media quality request indicates a desired quality for the media content;
wherein the CODEC is configured to transcode the media content according to the desired quality prior to the third signals being transmitted by the transmitter.
5. The video call device of claim 3, wherein the media interface comprises at least one of:
an SD card interface;
a USB interface; and
a mass storage interface.
6. The video call device of claim 1 , further comprising:
a processor configured to generate one or more first playback synchronization commands, wherein the first playback synchronization commands include timing information for playback of the media content;
wherein the transmitter is further configured to transmit fourth signals during the video call, wherein the fourth signals represent the one or more first playback synchronization commands.
7. The video call device of claim 6 , wherein:
the receiver is further configured to receive fifth signals during the video call, wherein the fifth signals represent one or more second playback synchronization commands; and
the processor is further configured to control playback of the media content according to the one or more second playback synchronization commands.
8. The video call device of claim 7 , wherein the playback synchronization commands represent at least one of:
a file transfer status for the media content;
a playback position for the media content; and
a time of a modification of the media content by a user.
9. The video call device of claim 1 , further comprising:
one or more cameras configured to provide the first video information to the video input interface; and
one or more microphones configured to provide the first audio information to the audio input interface.
10. A method comprising:
receiving first video information;
receiving first audio information;
transmitting first signals during a video call, wherein the first signals represent the first video information and the first audio information;
receiving second signals during the video call, wherein the second signals represent second video information and second audio information;
providing the second video information;
providing the second audio information; and
transmitting third signals during the video call, wherein the third signals represent at least one of
media content, and
a hyperlink, wherein the hyperlink indicates a location of the media content.
11. The method of claim 10 , further comprising:
receiving fourth signals, wherein the fourth signals represent a link partner media quality request, wherein the link partner media quality request indicates a desired quality for the first video information; and
transcoding the first video information according to the desired quality prior to transmitting the first signals.
12. The method of claim 10 , further comprising:
receiving the media content;
wherein the third signals represent the media content.
13. The method of claim 12 , further comprising:
receiving fourth signals, wherein the fourth signals represent a link partner media quality request, wherein the link partner media quality request indicates a desired quality for the media content; and
transcoding the media content according to the desired quality prior to transmitting the third signals.
14. The method of claim 10 , further comprising:
generating one or more first playback synchronization commands, wherein the first playback synchronization commands include timing information for playback of the media content; and
transmitting fourth signals during the video call, wherein the fourth signals represent the one or more first playback synchronization commands.
15. The method of claim 14 , further comprising:
receiving fifth signals during the video call, wherein the fifth signals represent one or more second playback synchronization commands; and
controlling playback of the media content according to the one or more second playback synchronization commands.
16. The method of claim 15 , wherein the playback synchronization commands represent at least one of:
a file transfer status for the media content;
a playback position for the media content; and
a time of a modification of the media content by a user.
17. Non-transitory computer-readable media embodying instructions executable by a computer to perform functions comprising:
receiving first video information and first audio information;
causing transmission of first signals during a video call, wherein the first signals represent the first video information and the first audio information;
providing second video information and second audio information based on second signals received during the video call, wherein the second signals represent the second video information and the second audio information;
causing transmission of third signals during the video call, wherein the third signals represent at least one of
media content, and
a hyperlink, wherein the hyperlink indicates a location of the media content.
18. The non-transitory computer-readable media of claim 17 , wherein the functions further comprise:
receiving a link partner media quality request, wherein the link partner media quality request indicates a desired quality for the first video information; and
transcoding the first video information according to the desired quality prior to causing transmission of the first signals.
19. The non-transitory computer-readable media of claim 17 , wherein the functions further comprise:
receiving the media content;
wherein the third signals represent the media content.
20. The non-transitory computer-readable media of claim 19, wherein the functions further comprise:
receiving a link partner media quality request, wherein the link partner media quality request indicates a desired quality for the media content; and
transcoding the media content according to the desired quality prior to causing transmission of the third signals.
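Claims 18 and 20 describe receiving a link partner media quality request and transcoding the video or media content to the requested quality before transmission. The selection logic below is a sketch under assumed quality tiers; the patent does not specify a quality ladder or a selection rule.

```python
# Assumed quality ladder: width, height, bitrate (bits/s). Illustrative only.
QUALITY_LADDER = {
    "low": (640, 360, 500_000),
    "medium": (1280, 720, 1_500_000),
    "high": (1920, 1080, 4_000_000),
}

def select_encoding(source_quality, requested_quality):
    """Pick the (width, height, bitrate) to transcode to before sending.

    Honors the link partner's request but never upscales: output quality
    is capped at the quality of the source material.
    """
    order = ["low", "medium", "high"]
    chosen = min(source_quality, requested_quality, key=order.index)
    return QUALITY_LADDER[chosen]

# The link partner asked for "medium"; the source is "high", so transcode down.
target = select_encoding("high", "medium")
```

The same selection would apply whether the content being sent is the live camera feed (claim 18) or shared media content (claim 20).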
21. The non-transitory computer-readable media of claim 17, wherein the functions further comprise:
generating one or more first playback synchronization commands, wherein the first playback synchronization commands include timing information for playback of the media content; and
causing transmission of fourth signals during the video call, wherein the fourth signals represent the one or more first playback synchronization commands.
22. The non-transitory computer-readable media of claim 21, wherein the functions further comprise:
receiving one or more second playback synchronization commands; and
controlling playback of the media content according to the one or more second playback synchronization commands.
23. The non-transitory computer-readable media of claim 22, wherein the playback synchronization commands represent at least one of:
a file transfer status for the media content;
a playback position for the media content; and
a time of a modification of the media content by a user.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US13/470,336 US20120287231A1 (en) | 2011-05-12 | 2012-05-13 | Media sharing during a video call |
Applications Claiming Priority (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US201161485233P | 2011-05-12 | 2011-05-12 | |
US201161485229P | 2011-05-12 | 2011-05-12 | |
US201161485237P | 2011-05-12 | 2011-05-12 | |
US13/470,336 US20120287231A1 (en) | 2011-05-12 | 2012-05-13 | Media sharing during a video call |
Publications (1)
Publication Number | Publication Date |
---|---|
US20120287231A1 true US20120287231A1 (en) | 2012-11-15 |
Family
ID=47141621
Family Applications (3)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US13/470,337 Abandoned US20120287219A1 (en) | 2011-05-12 | 2012-05-13 | Wireless network device configuration using image capture |
US13/470,339 Expired - Fee Related US8368737B2 (en) | 2011-05-12 | 2012-05-13 | Smart remote control devices for controlling video call devices |
US13/470,336 Abandoned US20120287231A1 (en) | 2011-05-12 | 2012-05-13 | Media sharing during a video call |
Family Applications Before (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US13/470,337 Abandoned US20120287219A1 (en) | 2011-05-12 | 2012-05-13 | Wireless network device configuration using image capture |
US13/470,339 Expired - Fee Related US8368737B2 (en) | 2011-05-12 | 2012-05-13 | Smart remote control devices for controlling video call devices |
Country Status (1)
Country | Link |
---|---|
US (3) | US20120287219A1 (en) |
Families Citing this family (12)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2013137787A1 (en) * | 2012-03-15 | 2013-09-19 | Telefonaktiebolaget L M Ericsson (Publ) | A configuration provision device and corresponding m2m device, system, method, computer program and computer program product |
CN103516674B (en) * | 2012-06-21 | 2016-10-12 | 棣南股份有限公司 | Method and control device for quickly bringing a network device online
US8970656B2 (en) * | 2012-12-20 | 2015-03-03 | Verizon Patent And Licensing Inc. | Static and dynamic video calling avatars |
JP2016514380A (en) * | 2013-01-15 | 2016-05-19 | ヴィバー メディア エスアーエールエル | Use of Smart TV function to enhance voice / video phone |
US9172908B2 (en) | 2013-06-18 | 2015-10-27 | Microsoft Technology Licensing, Llc | Unpaired devices |
US9681189B2 (en) | 2013-06-20 | 2017-06-13 | Microsoft Technology Licensing, Llc | Paired devices |
JP5774648B2 (en) * | 2013-08-08 | 2015-09-09 | 三菱電機株式会社 | Control system, control method, electric device, external controller, and program |
US9253439B2 (en) * | 2014-02-24 | 2016-02-02 | Cellco Partnership | Managing complex video call scenarios in volte calls |
US10827539B2 (en) | 2014-03-06 | 2020-11-03 | Gainspan Corporation | Remote provisioning of wireless stations with confirmation |
US10305966B2 (en) * | 2014-05-23 | 2019-05-28 | Anders Edvard Trell | System for authorization of access |
US9730255B1 (en) * | 2016-08-30 | 2017-08-08 | Polycom, Inc. | Room-specific pairing via a combined ultrasonic beacon/bluetooth approach |
TWI742394B (en) * | 2019-07-03 | 2021-10-11 | 神雲科技股份有限公司 | Server |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20090273712A1 (en) * | 2008-05-01 | 2009-11-05 | Elliott Landy | System and method for real-time synchronization of a video resource and different audio resources |
US20120013705A1 (en) * | 2010-07-15 | 2012-01-19 | Cisco Technology, Inc. | Switched multipoint conference using layered codecs |
US20120020456A1 (en) * | 2001-10-19 | 2012-01-26 | Hologic, Inc., | Mammography system and method employing offset compression paddles automatic collimation and retractable anti-scatter grid |
US20120050456A1 (en) * | 2010-08-27 | 2012-03-01 | Cisco Technology, Inc. | System and method for producing a performance via video conferencing in a network environment |
Family Cites Families (9)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6512919B2 (en) | 1998-12-14 | 2003-01-28 | Fujitsu Limited | Electronic shopping system utilizing a program downloadable wireless videophone |
WO2001058151A2 (en) * | 2000-02-04 | 2001-08-09 | Intel Corporation | Displaying enhanced content information on a remote control unit |
EP2980991B1 (en) * | 2001-07-12 | 2018-10-24 | Sony Corporation | Remote controller and system having the same |
US8054854B2 (en) * | 2004-08-26 | 2011-11-08 | Sony Corporation | Network remote control |
JP2006261938A (en) * | 2005-03-16 | 2006-09-28 | Sony Corp | Communications system, communications apparatus and method, recording medium, and program |
US20090284577A1 (en) * | 2008-05-13 | 2009-11-19 | Avi Kumar | Video telephone system and method |
CN102428405A (en) | 2009-06-16 | 2012-04-25 | 英特尔公司 | Camera applications in a handheld device |
NO332170B1 (en) * | 2009-10-14 | 2012-07-16 | Cisco Systems Int Sarl | Camera control device and method |
KR101682245B1 (en) * | 2010-06-18 | 2016-12-02 | 엘지전자 주식회사 | Display apparatus and method for connecting to video call thereof |
- 2012-05-13 US US13/470,337 patent/US20120287219A1/en not_active Abandoned
- 2012-05-13 US US13/470,339 patent/US8368737B2/en not_active Expired - Fee Related
- 2012-05-13 US US13/470,336 patent/US20120287231A1/en not_active Abandoned
Cited By (20)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US8982174B2 (en) * | 2012-02-03 | 2015-03-17 | Samsung Electronics Co., Ltd. | Video telephony system and control method thereof |
US20130201274A1 (en) * | 2012-02-03 | 2013-08-08 | Samsung Electronics Co. Ltd. | Video telephony system and control method thereof |
US9490650B2 (en) * | 2012-08-02 | 2016-11-08 | Sandisk Technologies Llc | Wireless power transfer |
US20140267578A1 (en) * | 2013-03-14 | 2014-09-18 | Samsung Electronics Co., Ltd. | Video injection for video communication |
US9363472B2 (en) * | 2013-03-14 | 2016-06-07 | Samsung Electronics Co., Ltd. | Video injection for video communication |
US9374788B2 (en) | 2013-12-19 | 2016-06-21 | Sandisk Technologies Inc. | Mobile device peripheral |
US20160021338A1 (en) * | 2014-07-17 | 2016-01-21 | Htc Corporation | Method for performing a video talk enhancement function and an electric device having the same |
US9609270B2 (en) * | 2014-07-17 | 2017-03-28 | Htc Corporation | Method for performing a video talk enhancement function and an electric device having the same |
US9344681B2 (en) * | 2014-08-21 | 2016-05-17 | Infocus Corporation | Systems and methods of incorporating live streaming sources into a video conference |
US10057305B2 (en) | 2014-09-10 | 2018-08-21 | Microsoft Technology Licensing, Llc | Real-time sharing during a phone call |
US20160134602A1 (en) * | 2014-11-06 | 2016-05-12 | Intel Corporation | Secure sharing of user annotated subscription media with trusted devices |
US9800561B2 (en) * | 2014-11-06 | 2017-10-24 | Intel Corporation | Secure sharing of user annotated subscription media with trusted devices |
US9521223B1 (en) | 2015-10-22 | 2016-12-13 | Sandisk Technologies Llc | Mobile device case and method for use therewith |
US9986080B2 (en) | 2016-06-24 | 2018-05-29 | Sandisk Technologies Llc | Mobile device and method for displaying information about files stored in a plurality of storage devices |
US10122840B2 (en) | 2016-06-24 | 2018-11-06 | Sandisk Technologies Llc | Displaying information about files stored in a plurality of storage devices |
US10235366B2 (en) | 2016-08-16 | 2019-03-19 | Microsoft Technology Licensing, Llc | Activity gallery view in communication platforms |
US11272139B2 (en) * | 2018-10-18 | 2022-03-08 | Sony Group Corporation | User interface for video call with content sharing |
US10630937B1 (en) * | 2018-12-19 | 2020-04-21 | Motorola Solutions, Inc. | Device, system and method for transmitting one or more of annotations and video prior to a video call |
US10681300B1 (en) * | 2019-02-14 | 2020-06-09 | Avaya Inc. | Split screen for video sharing |
WO2024046584A1 (en) * | 2022-09-02 | 2024-03-07 | G-Core Innovations S.À.R.L | Method of joint viewing remote multimedia content |
Also Published As
Publication number | Publication date |
---|---|
US20120287220A1 (en) | 2012-11-15 |
US8368737B2 (en) | 2013-02-05 |
US20120287219A1 (en) | 2012-11-15 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US20120287231A1 (en) | Media sharing during a video call | |
US10187668B2 (en) | Method, system and server for live streaming audio-video file | |
US8554938B2 (en) | Web browser proxy-client video system and method | |
EP2789120B1 (en) | Collaboration system and method | |
KR102157634B1 (en) | Image capturing method and local endpoint host device | |
US8803991B2 (en) | Snapshot capture in video stream | |
US9276997B2 (en) | Web browser proxy—client video system and method | |
US20100242066A1 (en) | Method of Performing Random Seek Preview for Streaming Video | |
US9137489B2 (en) | Platform for end point and digital content centric real-time shared experience for collaboration | |
KR100889367B1 (en) | System and Method for Realizing Vertual Studio via Network | |
US11489891B2 (en) | Virtual video driver bridge system for multi-source collaboration within a web conferencing system | |
WO2006042159A2 (en) | Interactive video collaboration framework | |
KR20130138263A (en) | Streaming digital video between video devices using a cable television system | |
US20140028778A1 (en) | Systems and methods for ad-hoc integration of tablets and phones in video communication systems | |
KR101942269B1 (en) | Apparatus and method for playing back and seeking media in web browser | |
US9407895B2 (en) | Apparatus and method for controlling a video | |
KR20170071251A (en) | Multi-point control unit for providing conference service | |
CN113573004A (en) | Video conference processing method and device, computer equipment and storage medium | |
CN107852523B (en) | Method, terminal and equipment for synchronizing media rendering between terminals | |
US9426415B2 (en) | System, method and architecture for in-built media enabled personal collaboration on endpoints capable of IP voice video communication | |
US20240098333A1 (en) | Video Playback based on an HTML iframe and a Headless Browser | |
Calvo‐Flores et al. | Integrating multimedia streaming from heterogeneous sources to JavaME mobile devices | |
JP6431301B2 (en) | Movie processing apparatus, method, and computer program | |
TWI538493B (en) | Multimedia system, cross - platform real - time sharing system and method | |
Boyaci | Advancing Multimedia: Application Sharing, Latency Measurements and User-Created Services |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AS | Assignment |
Owner name: TELY LABS, INC., CALIFORNIA
Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:RAVI, SREEKANTH;RAVI, SUDHAKAR;ZULLO, JEREMY;AND OTHERS;REEL/FRAME:028240/0897
Effective date: 20120517
STCB | Information on status: application discontinuation |
Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION |