EP0581101A1 - Audio/video communications processor - Google Patents

Audio/video communications processor

Info

Publication number
EP0581101A1
EP0581101A1 (application EP19930111135 / EP93111135A)
Authority
EP
Grant status
Application
Patent type
Prior art keywords
video
audio
communication processor
processor
system
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
EP19930111135
Other languages
German (de)
French (fr)
Other versions
EP0581101B1 (en)
Inventor
Joseph Claude Caci
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
International Business Machines Corp
Original Assignee
International Business Machines Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Family has litigation

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N7/00Television systems
    • H04N7/14Systems for two-way working
    • H04N7/15Conference systems
    • H04N7/152Multipoint control units therefor
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N7/00Television systems
    • H04N7/14Systems for two-way working
    • H04N7/141Systems for two-way working between two video terminals, e.g. videophone
    • H04N7/147Communication arrangements, e.g. identifying the communication as a video-communication, intermediate storage of the signals

Abstract

A communications processor serves a group of workstations with audio and video transmission processing for the purpose of providing video conferencing.
The communication processor utilizes artificial intelligence software to read connection conversion rules contained in tables, so that the system can react to the communications environment. The system is coupled for processing optical signals, for low-cost communication and video conferencing with audio and video communications within the facility area and for long-haul transmission. The communication processor provides audio and video communications under the instantaneous constraints of the transmission medium and the instantaneous degree of loading or usage. Bandwidth, resolution and transmission rate are adjustable to fit the constraints at the time a request for service is made. A workstation initiates a request for service. A request for service includes data about the nature or type of service and the signal destination. This information is sufficient for the communication processor to make several attempts to establish threads before an affirmative determination can be made. If an affirmative determination is not possible, then the communication processor will determine what is possible and provide an output to the user suggesting possible changes to the request.

Description

    FIELD OF THE INVENTION
  • This invention relates to an audio/video communication processor in the field of telecommunications and computers, and especially to those systems which serve a group of workstation-equipped users with the means to communicate audibly and visually.
  • GLOSSARY OF TERMS
    • WEIGHTING - A method by which software identifies a group of binary numbers as having a specific place value (e.g., hundreds or thousands in the decimal system) in a fixed binary number scheme.
    • COMPRESSION - A method by which software or hardware is used to generate codes which control the substitution of long strings of binary data with replacement codes that are one or more orders of magnitude smaller. The code may be in modulation format or binary data format.
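The substitution described above can be sketched in a few lines of Python; the table and codes here are invented for illustration and are not taken from the patent:

```python
def compress(data: bytes, table: dict) -> bytes:
    """Replace long known byte strings with much shorter codes."""
    for pattern, code in table.items():
        data = data.replace(pattern, code)
    return data

def expand(data: bytes, table: dict) -> bytes:
    """Reverse the substitution using the same table."""
    for pattern, code in table.items():
        data = data.replace(code, pattern)
    return data

# Hypothetical table: a 16-byte run of zeros becomes a 2-byte code,
# a reduction of roughly one order of magnitude.
TABLE = {b"\x00" * 16: b"\xff\x01"}

raw = b"\x00" * 16 + b"AB"
small = compress(raw, TABLE)
assert len(small) < len(raw)
assert expand(small, TABLE) == raw
```

Note that this toy version only round-trips when the code bytes cannot occur in the raw data; a real scheme would escape or frame them.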
    • PEL - A term describing an element of video data which is digital in nature: a binary number which represents the range of luminosity, chrominance and hue in a digital video system; the smallest independent unit of visual acuity represented by a binary number on a display screen. The term is a contraction of Picture ELement.
    • PIXEL - Another term for pel.
    • COMMUNICATIONS PROCESSOR - A term used to describe a new class of communications equipment which is used to connect workstation-equipped users with the resources to video conference over telecommunications networks. It consists of several subprocessors which in aggregate form the communications processor. The contracted form is "communication processor".
    • PACKET - An assemblage of binary data on the network side of the communication processor which contains network and communication processor control information, video data, and audio data.
    • DACS - Digital Access Cross Connect, a machine for interchanging 64,000 bit per second time slots in either DS or non DS format between several T1 carriers independent of the signaling codes in the DS or T1 carrier. DS refers to DS1, DS2, or DS3 digital telephone channel coding formats. The basic rate of 64,000 bps may be a non telephone channel which could contain any type of binary data. T1 refers to the digital multiplex basic rate 1.544 Mbps which may or may not consist of 24 DS type digital telephone channels.
    • TASI - Time Assigned Speech Interpolation, a term that refers to compressing speech. Speech, or more properly digital voice, is processed according to either mu-law or A-law weighting; one is a North American standard and the other is European. Other countries use the North American, European, or CCITT standard. International circuits are required to convert between the two, as they are not compatible. TASI algorithms read the data and eliminate data which represents voice silence, so only the actual speech sounds are transmitted. Additionally, TASI may insert codes for certain known patterns of voice data, which offers additional compression. The bandwidth thus saved is used for other purposes.
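The silence-elimination step can be sketched as follows; the amplitude threshold and the (position, value) bookkeeping are illustrative assumptions, not the actual TASI algorithm:

```python
def tasi_compress(samples, threshold=3):
    """Drop near-silent samples, keeping (position, value) pairs
    for the audible ones so the receiver can re-insert silence."""
    return [(i, s) for i, s in enumerate(samples) if abs(s) > threshold]

def tasi_expand(pairs, length):
    """Rebuild the frame, filling eliminated slots with silence (0)."""
    frame = [0] * length
    for i, s in pairs:
        frame[i] = s
    return frame

speech = [0, 1, 90, -80, 0, 0, 2, 70]
kept = tasi_compress(speech)          # only the loud samples survive
assert len(kept) < len(speech)
assert tasi_expand(kept, len(speech)) == [0, 0, 90, -80, 0, 0, 0, 70]
```

The samples dropped during silent stretches are what frees bandwidth for other users.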
    • Artificial Intelligence (AI) - A software program which reacts to and causes events to occur based on a set of rules. The rules are not necessarily hard and fast; they are generally soft rules, such as instructions given to a neophyte. A crucial characteristic of these rules is that they cover all possible events, even those of low probability, and they cover the permutations of events in any combination that is possible. The objective in formulating these rules is not to allow a situation with inadequate or impossible reactions to the external stimuli.
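A minimal sketch of such a rule table, with a soft fallback so that no stimulus lacks a defined reaction; the services, conditions, and actions are invented for illustration:

```python
# Every (service, condition) combination the designer anticipates has an
# entry; anything unanticipated falls through to a safe default, so no
# external stimulus is left without a defined reaction.
RULES = {
    ("video", "bandwidth_ok"):  "connect_full_rate",
    ("video", "bandwidth_low"): "connect_reduced_resolution",
    ("audio", "bandwidth_ok"):  "connect_full_rate",
    ("audio", "bandwidth_low"): "connect_compressed",
}

def react(service, condition):
    """Look up the reaction; unknown combinations get the soft default."""
    return RULES.get((service, condition), "report_options_to_user")

assert react("video", "bandwidth_low") == "connect_reduced_resolution"
assert react("data", "unknown") == "report_options_to_user"
```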
    • Plesiochronous - A characteristic of two or more clocking circuits such that, once a synchronizing process is performed and withdrawn, they remain synchronized for long periods of time. A solar day would be a minimally acceptable long period of time.
    BACKGROUND OF THE INVENTION
  • As background for the invention, a few patents are mentioned, and then we turn to other works which relate to products having some elements which may be relevant to a product made using the invention. United States Patent 4,949,169, issued 14 AUG 90 to LUMELSKY, and assigned to International Business Machines Corp., describes an interface architecture for interconnecting a number of video display devices together over a high speed digital communication link having limited bandwidth. The interface architecture at each display node provides for transmitting sequential pixels of data composed of separate Y and C fields from a digital TV source in each node representative of a scaled video window. Audio information is transmitted with the video on portions of the network bandwidth not used by the video. An object of that invention is to provide a hardware system which allows use of existing hardware in the various video display devices and associated communications adapters such that minimum additional control hardware and software is required.
  • United States Patent 4,780,761, issued 25 OCT 88, to DALY et al and assigned to Eastman Kodak Company relates to a device which recognizes that the human visual system is less sensitive to diagonally oriented spatial frequencies than to horizontal or vertical ones. The transceiver has a way of quantizing the transform coefficients according to a model of the human visual system. This system is not designed to video conference workstations over the telecommunications network. It does not time share a video subprocessor as part of an audio/video communication processor, and it does not function in a network control manner and hence cannot network.
  • United States Patent 4,494,144, issued 15 JAN 85, to BROWN and assigned to AT&T describes a reduced bandwidth video transmission with good video presence. The bandwidth reduction is accomplished by dividing the video picture of the camera into segments, determining the activity level within each segment, and transmitting the signal of each segment with a resolution level which is related to the activity level within the segment. The most active segment is transmitted at the highest resolution while other segments are transmitted at lower resolutions. This system is not designed to video conference workstations over the telecommunications network. It does not time share a video subprocessor as part of an audio/video communication processor and hence cannot network.
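The activity-based allocation BROWN's patent describes can be sketched as a proportional split of a bit budget; the segment numbers and the linear proportionality rule are illustrative assumptions:

```python
def assign_resolutions(activity, total_budget):
    """Give each picture segment a share of the bit budget proportional
    to its activity; the most active segment gets the highest resolution."""
    total = sum(activity) or 1   # avoid division by zero on a static scene
    return [round(total_budget * a / total) for a in activity]

# Four picture segments; segment 1 (say, a speaker's face) is most active.
activity = [5, 60, 10, 25]
budget = assign_resolutions(activity, total_budget=1000)
assert budget == [50, 600, 100, 250]
assert max(budget) == budget[1]   # most active segment gets the most bits
```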
  • United States Patent 4,862,264, issued 29 AUG 89, to WELLS and assigned to British Broadcasting Corporation describes a method of coding a video signal for transmission in a restricted bandwidth by subdividing a frame of picture information into a set of constituent blocks, measuring the amount of picture activity in each block, sampling the information in each block at a rate related to the amount of picture activity in that block, and adding to the coded block a supplementary signal indicating the rate used for that block. A decision is made for each block to transmit at full accuracy or reconstruct it from the previous frame. In effect each block is sampled twice simultaneously: the first sampling is at a subrate and the second sampling is at the Nyquist rate. A block activity generator and motion activity generator are used to make decisions on transmitting at high accuracy or low accuracy. The samples may be transmitted in analog or digital form.
  • United States Patent 4,654,484, issued 31 MAR 87, to REIFFEL and assigned to INTERAND CORP describes an improved apparatus for rapidly compressing, expanding and displaying broad band information which is transmitted over a narrow band communications channel. A video image is cyclically assembled in low resolution and high resolution phases from digitized data representing gray level intensity for individual pixels which have been grouped into cells. During the initial cycle of the low resolution phase, a representative sample of cell intensity values is transmitted by a sending station to a receiving station according to a video compression routine. The receiving station then uses a video expansion routine to calculate an intensity value for those pixels whose intensity values were not transmitted and displays an initial image.
  • United States Patent 4,682,225, issued 21 JUL 87 to GRAHM, and assigned to NASA, describes a method of, and apparatus for, telemetry adaptive bandwidth compression. An adaptive sampler generates a sequence of sampled fields from a video signal. Each field and its range rate information are sequentially transmitted to and stored in a multiple adaptive field storage means. The patented apparatus may be used in spacecraft docking systems wherein vast amounts of video information and data must be transmitted in a limited finite bandwidth. This invention is suited for space communication of both video and data signals from a spacecraft. In particular, a manual signal can control parameters such as range rate, sampling ratio, number of low resolution frames of video simultaneously displayed, or the portion of the down link communication bandwidth allocated between data and video.
  • While this patent has little relationship with the preferred application of the invention, it will be noted that this sophisticated system may be significantly improved upon in the invention. During a video conference, a user may request reinitializing at any time. The communication processor also recognizes degraded transmission data and reinitializes automatically. The communication processor controls reinitializing so as not to disrupt the existing channel bandwidth allocation among the users.
  • United States Patent 4,739,413, issued 19 APR 88 to MEYER, and assigned to LUMA TELECOM, describes a video-optimized modulator/demodulator with adjacent modulating amplitudes matched to adjacent pixel gray values. Each modulating symbol has a one to one correspondence with a particular pixel value of brightness. United States Patent 3,795,763, issued 5 MAR 74 to GOLDING, and assigned to COMSAT CORP, describes a digital television transmission system for transmitting at substantially reduced bit rate and bandwidth. Frequency interleaving techniques reduce the sampling rate, and digital differential PCM with edge recoding techniques reduces the number of bits per sample. Further reduction in bit rate is accomplished by eliminating about half the color data and all the sync pulses from the transmitted signal. Periodic sync words are transmitted to allow reconstruction of sync information. Transmitted bits are multiplexed in accordance with a particular format which provides proper alignment of the luminance and chrominance lines at the receiver. The Y and C are separated and sampled at less than the Nyquist rate. The samples are quantized and converted into difference samples having further bit reduction. The audio is sampled at the horizontal scan rate and the digital representations of audio and video are serially multiplexed into an output stream. Every other pair of C is completely eliminated from the multiplexed serial bit stream but is reconstructed at the receiver from adjacent C information.
  • United States Patent 5,043,810, issued 27 AUG 91 to VREESWIJK, and assigned to US PHILIPS, describes an improved method of transmitting or recording a television signal, an improved video signal processing apparatus, and an improved receiving apparatus. The method of processing is spatial and/or temporal consistency control of a selection relating to spatially and/or temporally neighboring parts of the image. The decision process includes neighboring parts of the image that may or may not have an effect on the part being processed. A block of pixels constituting a part of the image which is sampled in accordance with a sampling pattern not corresponding to a given operation, and which adjoins a block sampled with a sampling pattern corresponding to that operation, may be converted to the corresponding sampling pattern.
  • United States Patent 4,720,745, issued 19 JAN 88 to DEFOREST, and assigned to DIGIVISION Inc., describes a method and apparatus for enhancing video displays. An NTSC composite video signal is dematrixed and its RGB components are digitized into a 512 x 512 frame pixel array. One high resolution frame is generated from each input frame. The subpixel values for a given pixel are derived by examining the nearest neighboring pixels and using enhancement algorithms represented by data in lookup tables. Signal-to-noise ratios are handled by comparing and deciding whether to change the value of a pixel based on the values of its nearest neighbors, or to replace it with the median of it and its neighbors.
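The neighbor-median decision described above can be sketched for a single pixel; the 3 x 3 neighborhood and the difference threshold of 50 are illustrative assumptions, not values from the patent:

```python
from statistics import median

def denoise(img, r, c):
    """Replace a pixel with the median of itself and its 8 neighbors
    when it differs sharply from that median (a likely noise hit)."""
    vals = [img[i][j]
            for i in range(r - 1, r + 2)
            for j in range(c - 1, c + 2)]
    m = median(vals)
    return m if abs(img[r][c] - m) > 50 else img[r][c]

img = [[10, 12, 11],
       [13, 255, 12],   # centre pixel is an impulse-noise spike
       [11, 10, 13]]
assert denoise(img, 1, 1) == 12   # spike replaced by neighborhood median
```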
  • United States Patent 4,858,026, issued 15 AUG 89 to RICHARDS, and assigned to US PHILIPS, describes a method of coding an image to be displayed. The image is coded using data compression which consists of first obtaining pixel information as a first matrix of high resolution. A second matrix of lower resolution is derived from the first through low pass filtering. A third matrix is the difference between the two. A fourth matrix is produced by subsampling the second matrix (not every pixel is used). The third and fourth matrices are coded. Complementary decoding consists of reconstructing the second matrix by interpolation-filtering the decoded fourth matrix, and combining the reconstructed second matrix with the decoded third matrix. This method has applications such as compact disk image encoding, but the communication processor does not work according to the principles set forth in this patent.
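The four-matrix scheme can be sketched in one dimension; the 3-tap average low-pass filter and nearest-neighbor interpolation below are stand-ins, since the patent summary does not specify the filters:

```python
def lowpass(x):
    """Simple 3-tap average as the low-pass filter (edges clamped)."""
    return [(x[max(i - 1, 0)] + x[i] + x[min(i + 1, len(x) - 1)]) / 3
            for i in range(len(x))]

def encode(first):
    second = lowpass(first)                 # lower-resolution version
    third = [a - b for a, b in zip(first, second)]   # detail difference
    fourth = second[::2]                    # subsampled low-pass matrix
    return third, fourth                    # only these two are coded

def decode(third, fourth, n):
    # Interpolation filter: nearest-neighbor expansion of the subsamples.
    second = [fourth[min(i // 2, len(fourth) - 1)] for i in range(n)]
    return [s + d for s, d in zip(second, third)]

first = [10, 20, 30, 40, 50, 60]
third, fourth = encode(first)
approx = decode(third, fourth, len(first))
# Reconstruction is close to the original; exact only with matched filters.
assert all(abs(a - b) <= 10 for a, b in zip(approx, first))
```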
  • United States Patent 4,733,299, issued 22 MAR 88 to GLENN, and assigned to NYIT, describes a method for conversion of interlaced scanned video signals to progressive scanned video signals. The applicant has learned that motion adaptive processing is not required. Low resolution information is obtained from the current interlaced field, and the remaining detailed information is obtained from a stored signal that includes a prior field or fields. Only the detail signal is obtained from prior fields, and since human vision does not as quickly perceive motion of high spatial frequencies, there will be little, if any, perceived motion artifact.
  • United States Patent 4,551,755, issued 5 NOV 85 to MATSUDA, and assigned to PIONEER ELECTRIC CO, describes a bandwidth correcting system for a television tuner. A bandwidth control voltage applied to a bandwidth adjusting circuit provides a passband width which is determined by the relative levels of a video intermediate frequency signal and an audio intermediate frequency signal. This patent seems to imply that the transmission process will overmodulate, or exceed the allocated bandwidth for an instant from time to time, and that a correcting signal will cause the receiver to recognize this condition and adjust.
  • The communication processor has a different application, process and method than described in 4,551,755. However, the concepts of bandwidth correcting and bandwidth allocation must be differentiated. A bandwidth allocation from video to voice, or the reverse, does not necessarily need to be made, and if it is made, it is not for the purpose of correcting a malfunction such as overmodulation, but rather to manage both types of bandwidth.
  • United States Patent 4,792,993, issued 20 DEC 88 to MA, and assigned to CAPETRONIC (BSR) Ltd., describes a method of improved TVRO (TeleVision Receive Only, usually referring to TV by satellite) reception. The improvement is in automatically filtering the audio signals to a frequency range of the band outside that of the modulated video signal, combining the filtered audio signals and the video signal, and transmitting such signals through the restricted bandwidth channel. At the distant end the reverse process separates the signals into audio and video.
  • EN8910260 has nothing in common with this patent.
  • United States Patent 4,849,811, issued 18 JUL 89 to KLEINERMAN and assigned to KLEINERMAN, describes a method for simultaneously sending audio and video signals over standard telephone lines or another channel having restricted bandwidth. It comprises obtaining a video image, digitizing the image, modulating a signal with the digitized image, obtaining audio signals and filtering the audio signals to a frequency range of the band outside that of the modulated video signal, combining the filtered audio signals and the video signal, and transmitting such signals through the restricted bandwidth channel. At the distant end the reverse process separates the signals into audio and video.
  • The communication processor has some elements in common with this patent, such as digitizing the video and processing it digitally, but it improves on it by combining the audio and video together into a continuous channel frame separated only by software protocol. Patent 4,849,811 separates the audio and video in the restricted bandwidth frame by modulating the video and audio separately. The communication processor does not use a telephone channel. Instead, it uses a digital trunk, the smallest of which is 64,000 serial bits per second. The long distance carriers can provide this as trunk capacity or leased lines, of which 64,000 serial bits per second provides minimal video conferencing service. Best performance is obtained with high bandwidth network carriers.
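The idea of a continuous channel frame in which audio and video are separated only by software protocol can be sketched as follows; the 4-byte length header is an invented layout for illustration, not the actual protocol:

```python
import struct

def pack_frame(audio: bytes, video: bytes) -> bytes:
    """One serial frame: a 4-byte header carries the audio and video
    lengths, so the receiver can split the payload in software with
    no separate modulation of the two signals."""
    return struct.pack(">HH", len(audio), len(video)) + audio + video

def unpack_frame(frame: bytes):
    """Recover the audio and video payloads from one serial frame."""
    alen, vlen = struct.unpack(">HH", frame[:4])
    return frame[4:4 + alen], frame[4 + alen:4 + alen + vlen]

frame = pack_frame(b"AUDIO", b"VIDEODATA")
assert unpack_frame(frame) == (b"AUDIO", b"VIDEODATA")
```

Frames like this can then be streamed back to back over a 64,000 bit-per-second serial trunk.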
  • United States Patent 4,425,642, issued 10 JAN 84 to MOSES, and assigned to APP SPEC TECH Inc., describes a co-channel communications system which permits a digital data signal to be transmitted simultaneously within a communications medium signal such as telephone voice or television video. The data signals are converted to very low multifrequency signals consisting of fundamental frequencies and harmonics which span the communications bandwidth. The data signal is spread spectrum modulated; its energy content is spread over the entire band, causing a small degradation in signal-to-noise ratio by adding what appears to be pseudo-noise to the audio or video signal. Since the data signals coherently produce the pseudo-noise, it is detected coherently and removed from the audio or video at the receiver.
  • The communication processor does not perform spread spectrum modulation, nor does it co-channel data with voice or video. Rather, the communication processor uses a protocol to keep audio, video, and data in a serial transmission channel.
  • United States Patent 3,873,771, issued 25 MAR 75 to KLEINERMAN, and assigned to TELSCAN, describes a system for simultaneously transmitting a video and audio signal through the same transmission line, using FM slow scan TV for the video while the audio signal is transmitted by AM single sideband technique. Both the video and audio occupy the channel at the same time in separate frequency regions. The communication processor need not use any analog modulation techniques, frequency multiplexing techniques, or slow scan FM TV techniques.
  • United States Patent 4,797,750, issued 10 JAN 89 to KARWEIT, and assigned to J. HOPKINS U., describes a method and apparatus for transmitting a recorded computer generated display simultaneously with the transmission of audio and/or video signals. A computer generates a series of codes from which an image may be derived, the resolution of which is not dependent on the recording transmission medium. These codes are supplied to a first modem through an RS-232 communications line. The first modem converts these codes into image-bearing audio tones. The audio tones are input into the left audio channel of a video recorder. Simultaneously, aural information is picked up by a microphone and input into the right audio channel while a video camera provides video signals to the video channel of the recorder. On playback the audio in the left channel is decoded by the modem and reconverted back to a computer generated display. The communication processor does not perform spread spectrum modulation, nor does it use an RS-232 communications line; instead, it illustrates a different kind of use of a modem.
  • United States Patent 4,736,407, issued 5 APR 88 to DUMAS, and assigned to US ARMY, describes an audiographic conferencing system between two or more users either directly connected or through a bridging device over voice grade telephone lines. Each user has a personal computer, software, a smart modem, a cassette player/recorder, and a speaker phone, connected as shown in figure 1. The smart modems listen for a bauded signal and, if one is present, decode it and pass it to the computer; the speaker phone allows the user to listen to speech while being under software control. The cassette recorder/player is used for unattended operation.
  • The communication processor differs from this patent in that no video conferencing is performed by the patented system; rather, computer data and voice are conferenced.
  • United States Patent 4,955,048, issued 4 SEPT 90 to IWAMURA, and assigned to SHARP Kabushiki Kaisha, describes a method for multiplexing the transmission of audio and video signals. The video signal is separated into a luminance (Y) and a chrominance (C) signal. The Y signal is then modulated and the C signal is balanced modulated with a low frequency carrier. The resultant C modulated signal is converted to a lower frequency. The audio signal, frequency modulated Y signal and frequency converted C signal are multiplexed by frequency division to be transmitted across a telephone cable.
  • United States Patent 4,999,831, issued 12 MAR 91 to GRACE, and assigned to UNITED TELCOMM Inc, describes a method of digital transmission of wideband video, narrowband audio, and digital information over information networks. It describes synchronous quantized subcarrier multiplexing, which results in electronic multiplexing of voice, data and multiple channel full bandwidth NTSC video for digital transmission over a communication line, with recovery processing. The channels to be multiplexed must be carefully chosen for frequency content so as not to interfere with each other; then the signals are low pass filtered and modulated with local reference signals (Double Side Band Suppressed Carrier, DSBSC), and consequently form baseband, midband and high band channels which are combined and input into a D/A converter. This results in a serial bit stream known as quantized-SCM. This patent is unlike the present invention.
  • United States Patent 5,027,400, issued 25 JUN 91 to BAJI, and assigned to HITACHI Ltd, describes a multimedia bidirectional broadcast system. The main control unit receives information over a network from subscriber stations. Software in the main unit decodes the request from the subscriber station and provides the service by controlling all transmission processes. The service may be a motion picture or a commercial data base. Transmission also includes bandwidth compression on a video signal. This system is described as providing a broadband ISDN broadcast system, and to provide CATV with means for using a limited number of cable channels.
  • The communication processor differs from this patent in that the patent is designed for interactive advertising on ISDN broadband networks, where shoppers can see video images of the product and interact with the master station to perform transactions. This patent is also applicable to CATV systems where customers can order video programming services selectively, rather than having technicians hard wire customer requested programming services. The communication processor is intended to do workstation video conferencing.
  • United States Patent 4,541,008, issued 10 SEPT 85 to FISHMAN, and assigned to JONES FUTURA FOUNDATION Ltd., describes a television signal transmission system incorporating circuits for processing and encoding a repetition-reduced signal. The system separates the video components, generates sampled digital values of the color and intensity components, and puts them in a storage buffer. A data processor compares successive samples of the component video data, from which it generates variable length blocks of video data to represent slowly varying or rapidly varying signals. There is circuitry for encoding and multiplexing audio and synchronizing data into the signal stream, circuitry for encoding signal and control data for transmission to a receiver, and a circuit at the receiver for reversing this process. The process of reducing repetition is to use variable velocity scanning, using codes to indicate when color, intensity, and luminance information is repeatable. When information is repeatable, only every eighth sample is transmitted. Timing information to control the scan rate is crucial; the orderly progression of the line scan now depends on circuits to make up the rate tag with interpolation data. The communication processor, on the other hand, reduces repetitious video data.
  • United States Patent 4,394,774, issued 19 JUL 83 to WIDERGREN, and assigned to COMPRESSION LABS Inc, describes a digital video compression and expansion system and methods for compressing and expanding digital video signals in real time at rates up to NTSC color broadcast rates. The system compressor receives digital frames, divides them into subframes, and performs a single pass spatial domain to transform domain transformation in two dimensions of picture elements. The resultant coefficients are normalized and compressed using a predetermined ratio. There is an adaptive rate buffer to control feedback for compression. The compressor adaptively determines the rate buffer capacity control feedback component in relation to the instantaneous data content of the rate buffer memory relative to its capacity, and it controls the absolute quantity of data resulting from the normalization step so the buffer is never empty or full. In practice, the color picture is divided into luminance and I and Q chrominance components. The luminance component is compressed and expanded with the scene adaptive coding rate buffer feedback technique. The I and Q components are given simple spatial low pass filtering followed by spatial subsampling, with dimensional interpolation at the system receiver. The audio is filtered and sampled at a fixed rate, muxed together with bit stream synchronization codes, and transmitted as a serial bit stream.
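The rate-buffer feedback described above, which keeps the buffer never empty or full by adjusting how much data the normalization step produces, can be sketched as a simple controller; the fill thresholds and the doubling/halving rule are illustrative assumptions, not the patent's actual control law:

```python
def next_quantizer_step(step, buffer_fill, capacity):
    """Adaptive rate-buffer feedback: as the buffer fills, coarsen
    quantization (less data produced); as it drains, refine it."""
    fill = buffer_fill / capacity
    if fill > 0.75:
        return step * 2           # coarser quantization, lower bit rate
    if fill < 0.25:
        return max(1, step // 2)  # finer quantization, higher bit rate
    return step                   # comfortable middle: leave it alone

assert next_quantizer_step(4, 900, 1000) == 8   # nearly full -> coarsen
assert next_quantizer_step(4, 100, 1000) == 2   # nearly empty -> refine
assert next_quantizer_step(4, 500, 1000) == 4   # steady state
```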
  • ADDITIONAL BACKGROUND AS TO OTHER PRODUCTS
  • During the detailed description which follows, the following works will be referenced as an aid to the reader. These additional references are incorporated by reference:
    • 1) Article in April 1, 1991 issue of PC Week Journal on Page 43 entitled "Analysts Expect Video Meetings to Boom in 90's" by Michael Zimmerman. Industry experts prognosticate growth in video conferencing as a form of communication.
    • 2) Stanford Computer Optics Inc. 3530 Sugarberry Lane, P.O. Box 31266, Walnut Creek CA 94598, makes a product called "4 Quick" that is an image acquisition device that makes 30 to 60 frames a second with variable delay to make a 512 by 512 low light image. It operates between the wavelengths of 130 - 920 nm and can make a frame in as little as 5 ns. This device would be applicable for making image frames for several users in a time share arrangement.
    • 3) Welch Allen Inspection Systems Division, 4619 Jordan Road, Skaneateles Falls, New York 13153, makes a product called a "VP3 Videoendoscope". A small video probe, a miniature video imaging camera slightly larger than a fountain pen, makes the image and sends it along an umbilical cord back to a device where the image is compressed and modulated for transmission onto a telephone line at speeds from 14,400 bps to 2,400 bps. This device represents a possible integration into a display bezel.
    • 4) Dialogic, 300 Littleton Road, Parsippany N.J. makes a "Call Processor" and "Audiotex" Information program for voice messaging and automated attendant telephone network services. It uses interactive voice response as a key feature for centralized dictation services. This product is representative of the kinds of voice compression and processing achievable.
    • 5) Telephote Communications, 11722-D Sorrento Valley Road, San Diego CA 92121, makes an image compression product called "ALICE" that performs 15 to 1 compression with no loss of resolution. It is a software device and is designed to integrate into other products such as Teleconferencing, Picture Data Base and Surveillance. It claims to be able to send full color images over standard phone lines in less than 10 seconds, or store 4,000 high resolution color images on a 50 Mbyte hard disk. This product is representative of the kinds of compression levels achievable.
    • 6) Dataproducts New England, Barnes Park North, Wallingford CT 06492, makes a product called "DPMUX M-44". This device accepts channel input from a telephone switch using E & M signaling, or data in baseband format from a digital device, into a port. The voice signals are then converted to digital voice of the required bit rate or, in the case of data, retimed to network timing and assigned a time slot on the aggregate network transmission side. The aggregate side is either fractional or whole T1 service. This machine is configurable by an operator for the number of channels, bit rate assignment, timing assignments (asynchronous, synchronous, plesiochronous), and class of voice service. The concept of operator configuration downloaded by aggregate frame to the distant end is extendable to automatic configuration generated by a controlling algorithm and downloaded by an aggregate frame to the distant end.
    • 7) Network transmission equipment made by various manufacturers has functions that are applicable to the communication processor. These functions are time slot interchange or Digital Access Crossconnect Service (DACS) machines, and Time Assigned Speech Interpolation machines (TASI, voice data as an instantaneous bandwidth variable) such as the IDNX product from Network Equipment Technologies, 800 Saginaw Drive, Redwood City, CA 94063. This class of equipment fits data and voice into an adjustable format to make efficient use of the available bandwidth, and is described as a transmission resource manager. The concept of managing transmission resources is applicable to the communication processor for the purpose of workstation video conferencing.
    • 8) Feature article published in the February 1991 issue of Telecommunications. The article is entitled "Streamlining Protocols" by William Stallings. The article discusses changes to transmission protocols for streamlining and improving transmission performance. The communication processor will use an adaptive protocol that will allow it to communicate with multiple types of electronic digital transmission equipment to facilitate transmission control for the purpose of workstation video conferencing.
    • 9) Feature article published in the February 1991 issue of Telecommunications. The article is entitled "LAN Interconnections Technology" by Michael Grimshaw. The article discusses the differences between bridges, routers, repeaters and switches as used in local area networking and interfaces to the transmission network. The communication processor will perform the network functions of bridging, routing, repeating and switching.
    • 10) International Telegraph and Telephone Consultative Committee (CCITT) IX Plenary Assembly - Document 52, Study Group VII - Report R43, Recommendation X.200, entitled REFERENCE MODEL OF OPEN SYSTEMS INTERCONNECTION FOR CCITT APPLICATIONS. This is also referred to as the 7 layer ISO model. The application-process as referred to in that document is considered to be video conferencing as performed by the communication processor.
    SUMMARY OF THE INVENTIONS
  • The invention can be used for telecommunications networks with different channels and different tariff offerings, to which the communications processor interfaces for and on behalf of the workstation users to provide the requested video conferencing services.
  • A workstation needs to communicate with other workstations with operator audio and video intelligence. A picture or video of the operator with audio would greatly enhance personal productivity by significantly improving person to person communications. The objectives of this disclosure are:
    • 1) Propose a low cost, simple video interface approach from the workstation to the Communication Processor. There is more than one solution for this element of the design; however, only one needs to be shown.
    • 2) Propose a simple video weighting algorithm that will give a perception of high quality without the need for high bandwidth. There is more than one weighting algorithm that would satisfy a weighting requirement. Since the weighting plan is proportional to transmission bandwidth and inversely proportional to display performance, the communication processor will select the appropriate weighting plan that gives the best performance under existing conditions. The use of an algorithm for selecting the weighting algorithm is important.
    • 3) Propose an audio management algorithm that includes the use of several weighting standards, allowing the Communication Processor to be compatible across several standards and permitting connection to various types of network circuits. The plan takes into account the type of service requested and the service required at the distant end, and makes a selection based on a match between the two. This selection then becomes a criterion for item 2.
    • 4) Propose a network management algorithm for dynamic multiplex transmission on hierarchical T carriers down to the DS0 level, and on fiber optic media standards like SONET and FDDI. The DS0 level is the lowest common denominator for an audio/video channel because communications evolution requires compatibility with existing standards and DS0 is a predominant interface type. A key function of the communication processor is to assemble audio and video onto a small fraction of a T1 carrier as a minimum. When bandwidths permit, the communication processor will use higher carrier rates.
    • 5) Artificial Intelligence (AI) for the system controls the final processing, using the transmission network as an input base for processing decisions. Factors such as connectivity and activity are some of the criteria used to evaluate the possibility of connections. To these factors are added user requests for service, capabilities of the workstation, and distant end user availability.
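The weighting-plan selection of objective 2, driven by the DS0-granular bandwidth of objective 4, can be sketched as follows. This is a minimal illustration only: the bits-per-pel figure of merit, the plan names and the thresholds are invented for the sketch and do not appear in the disclosure.

```python
DS0_BPS = 64_000  # DS0: the lowest common denominator channel rate

def select_weighting_plan(available_bps, display_pels_per_frame):
    """Pick a weighting plan: weighting effort rises as bandwidth falls
    relative to display performance (illustrative thresholds)."""
    # Dimensionless figure of merit: bits available per displayed pel.
    merit = available_bps / display_pels_per_frame
    if merit >= 1.0:
        return "full-accuracy"      # ample bandwidth: weight lightly
    elif merit >= 0.25:
        return "medium-weighting"   # trade some fidelity for rate
    return "heavy-weighting"        # aggressive weighting/compression

# Example: a single DS0 serving a 512 x 512 window
plan = select_weighting_plan(DS0_BPS, 512 * 512)
```

A fuller model would also weigh the user's requested service and distant end capability, as item 5 describes.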
  • These objects are solved in general by the solution given in the independent claims.
  • Further advantageous embodiments of the present invention are laid down in the subclaims.
  • In order to achieve the above objects, a system for coupling workstation units is provided which transmits audio and video information over a carrier and which has a communication processor with a digital bus for intercoupling elements coupled to the communication processor. The communication processor's network interface ports include a port for a network carrier signal and a port for a local loop carrier signal. The communication processor interconnects the various units on the network. A workstation interface, a video processor and an audio processor are interconnected to pass digital and analog signals therebetween and to pass digital information via the system digital bus. For control of the system, a channel frame processor is provided, connected to said digital bus, for controlling communication over said digital bus. In addition, a statistical audio/video multiplexing processor is connected to the digital bus for dynamically allocating bandwidth between audio and video information signals on the digital bus.
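The dynamic allocation performed by the statistical audio/video multiplexing processor can be sketched in miniature: bandwidth not needed by audio at a given instant is handed to video. The audio rate and the binary activity model are assumptions for illustration only.

```python
def allocate(channel_bps, audio_active, audio_rate_bps=16_000):
    """Return (audio_bps, video_bps) for the current instant.
    When the talker is silent, video receives the whole channel."""
    audio_bps = audio_rate_bps if audio_active else 0
    video_bps = channel_bps - audio_bps  # video takes the remainder
    return audio_bps, video_bps
```

In the actual system this decision would be revisited continuously and coordinated with the channel frame processor.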
  • The audio processor has a voice compression/weighting subprocessor shared by several users coupled to the communication processor system. In addition, it has a way to perform compression and weighting based on an allocatable bandwidth provided by a user of the system and based upon a decision made by the communication processor to allocate final bandwidth process control for the requested bandwidth.
  • The video processor has a video compression/weighting subprocessor shared by several users coupled to the communication processor system. This provides for compressing video information, subject to control from the communication processor based on video activity, allocatable bandwidth and weighting provided by a user of the system, and based upon a decision made by the communication processor to allocate final bandwidth process control for the requested bandwidth.
  • The system is provided with a statistical audio/video multiplexing processor shared by several users of the system for performing modelling of the communications channel and usage situation and computing parameters for channel transmission in the system. The channel frame processor may be coupled to be shared among users for assembling a channel frame for insertion into an aggregate frame.
  • Further, a composite super frame processor is provided, common to all users, one per port, for implementing the language of the telecommunications interface.
  • The audio/video communication processor system can execute artificial intelligence (AI) software, common to all users, to which all other system processor elements are subservient, for synthesizing implementation parameters of digital language modulation conversion from one channel in an aggregate frame to reproduce the channel in another aggregate frame, for setting up conditions of facsimile usage, and for threading channels to establish connections between system elements for connection of workstations operating with said system.
  • These and other features will be described in connection with the following drawings.
  • BRIEF DESCRIPTION OF THE DRAWINGS.
  • FIGURE 1 shows the various classes of transmission equipment in Venn diagram format. The intersection of the circles represents the transmission equipment, and the ellipse the communication processor. Since the communication processor must work with the various types of network equipment employed by the network carrier to provide the tariff service, it must be able to select the appropriate algorithm with all its advantages and constraints. The communication processor must then use the advantages and work within the constraints to formulate the channel packet according to the user requested service.
  • Because the communication processor must do the same kinds of things as the transmission equipment and at the same time add audio and video processing to those functions, it becomes a new class of equipment. In the telecommunications lexicon, DCE is Data Communications Equipment, that which belongs to the telecommunications company. DTE is Data Terminal Equipment, that which belongs to the user of the telecommunications company's services. The distinction of DCE or DTE is applied to equipment as such. Since this is a new class of equipment, marketplace usage will ultimately determine whether it is a DTE or DCE, because it has elements of both. FIGURE 1 is called AUDIO VIDEO COMMUNICATIONS PROCESSOR, NEW MACHINE CLASS.
  • FIGURE 2 shows the communication processor interface at the workstation level. It consists of a bezel which contains the electronics to make the forward video, process transmit and receive voice, and receive the feedback and distant end video. It communicates with the communication processor with control data and receives data for the user interface such as menu selections and session data. Implied, but not shown, are the software and interface adapter card which cause the workstation to interface with the communication processor. The bezel does not interface with the communication processor directly.
  • FIGURE 3 shows how a video weighting plan may be constructed for presentation as shown in the display area of FIGURE 2.
  • FIGURE 4 shows a channel frame consisting of three types of data. The figure is entitled FRAME PRIOR TO AGGREGATE ENCODING.
  • FIGURE 5 shows the data output structure on the aggregate port of the communication processor. This figure is entitled SUPER FRAME SCHEMATIC.
  • FIGURE 6 shows communication processor high level functions and interfaces. It can be entitled COMM PROCESSOR FUNCTIONAL DIAGRAM. As shown, this is the preferred functional embodiment. Looking towards the workstation, the communication processor would seem to be a DTE. On the network side are network connections called Aggregate Ports. Out of each port an Aggregate Frame is transmitted. Each aggregate frame contains the channel frames from the workstations. A channel frame may come in on one port without having a destination at this point, but would be routed, or bridged, back onto another port for another destination. The information required to do this must return to the originating communication processor for proper set up and coordination.
  • FIGURE 7A shows the basic concept of how the communication processor, which is connected to the workstation group and the telecommunications network, functions. It is entitled NETWORK CONCEPT.
  • FIGURE 7B describes the preferred embodiment of the invention, which is then incorporated in larger system units as described in connection with the other drawings.
  • FIGURE 8 shows a unique image processing scheme that can be applied to the communication processor.
  • FIGURE 11 is an extension of FIGURE 3 with the addition of vertical and horizontal line numbers. The numbers shown are taken from IBM's VGA graphics specification. The purpose of the figure is to stress that, since each workstation is required to use a window, and a window is a fraction of a total viewing screen, the bandwidth is related to the video specification but not driven by it. Furthermore, workstations with two disparate graphics specifications can video conference.
  • FIGURE 12 shows an idealized bandpass frequency response of a focal plane image detector. Part of the video processing is the integration of frequency from the low frequency limit to the high limit into a single value representing the pel. The figure is entitled DETECTOR FREQUENCY INTEGRATION SCALE.
  • FIGURE 13 shows the types of signals that would be interfaced between the communication processor and the workstation using the base technique shown in FIGURE 10. The FIGURE is entitled COMM PROCESSOR AUDIO INTERFACE. The purpose of this FIGURE is to show that complex multiplexing, or fiber optic arrangements, are not required to perform this function. Simple frequency division multiplexing on limited bandwidth RG cable is sufficient.
  • FIGURE 14 shows a possible approach to processing voice signals. The approach is to initially treat the voice as a 64 kbps PCM signal with either mu-law or a-law weighting prior to doing any compression or weighting on the signal, if the carrier is at T1 rates or higher and activity is light. If the bandwidth does not need to be reduced, then the voice signal needs no further audio bandwidth reduction processing, although the audio may still be subjected to TASI (Time Assigned Speech Interpolation) processing. TASI works with the video processing section in the trading of allocated bandwidth between audio and video. The figure is entitled AUDIO FORMULATION FOR A PACKET.
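The mu-law weighting mentioned above is standard 64 kbps PCM companding. As a sketch of the idea, the continuous mu-law curve (with the conventional parameter mu = 255) maps a linear sample into a companded value that devotes proportionally more resolution to quiet signals:

```python
import math

MU = 255  # conventional mu-law parameter for North American 64 kbps PCM

def mu_law_compress(x):
    """Map a linear sample x in [-1, 1] to a companded value in [-1, 1]
    using the continuous mu-law curve: sgn(x)*ln(1+MU|x|)/ln(1+MU)."""
    sign = 1 if x >= 0 else -1
    return sign * math.log(1 + MU * abs(x)) / math.log(1 + MU)
```

A real codec quantizes this curve into 8-bit samples at 8,000 samples per second, giving the 64 kbps rate the figure starts from.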
  • FIGURE 15 shows how voice may be replaced by a fax signal or text in the channel packet. Since fax is an audio band representation of an image, a complex image may be transmitted from workstation to workstation using the fax capability. The invention, together with its advantages and features, is described by way of example with reference to the following drawings.
  • FIGURE 16 illustrates how communication processors are interconnected.
  • FIGURE 17 illustrates changes in error bits with respect to block size.
  • DETAILED DESCRIPTION OF THE INVENTIONS.
  • Before considering our preferred embodiments in detail, it may be worthwhile to illustrate, by way of example, how the communication processor would function. A user (#1) at a workstation decides to request a video conference session with another user (#2). The second user is at the same facility but in another building. User #1 brings up a menu on the screen. The menu is interactive with a mouse (for the sake of argument) so that the user clicks on the "Who is Who" user directory. From there he selects the name of the individual he wishes to have a session with. The name is associated by list to a routing directory. The user also selects the size of the window for the receive video. The names in the directory window are associated with and mapped into the routing directory. Since the communication processor associates a route to user #2, the communication processor associates a network port and a telecommunications network interface (ISO levels) for that port. The communication processor then tests the port for available bandwidth, queues the request in a jobs-to-do list, and goes back to the user at the workstation with suggested window sizes.
  • Tagged to the window size is a high fidelity symbol. The high fidelity symbol represents voice quality, which is the trade-off against window size. As the user varies the size of the window from small to large, the high fidelity symbol varies from large to small. The video window is composed of two parts: one part is size and the other part is speed. The speed at which the video is updated is represented by rotating wheels (motion picture camera reels). The communication processor controls the speed of the wheels by telling the workstation the high range for window size and speed that can be allowed at this time. The workstation, through the applications software, works the symbols and proportions them from a low value up to the maximum. The faster the wheels spin, the more natural the video; the slower the spin, the less natural. Experience will guide the user to an acceptable value.
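The inverse coupling between the window symbol and the high fidelity symbol can be sketched as a split of a fixed on-screen "budget" set by the communication processor. The budget and the fractional sizing are illustrative assumptions, not values from the disclosure.

```python
def symbol_sizes(window_fraction, budget=1.0):
    """window_fraction in (0, 1]: requested window size as a fraction of
    the maximum the communication processor currently permits. The high
    fidelity symbol shrinks as the window grows, so their sum stays
    within the budget."""
    window = budget * window_fraction
    fidelity = budget - window  # what remains is shown as voice quality
    return window, fidelity
```

With ample bandwidth the budget would be large and the interaction barely visible; with a small budget a small window change moves the fidelity symbol quickly, as the text describes.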
  • The communication processor will then initiate the session. This consists of opening a channel to the other communication processor and forwarding a request for service. The first communication processor has determined that bandwidth is available. The second communication processor must determine if user #2 is available.
  • The improved video processing of the hardware system being described does, like U.S. Patent 4,862,264, allow the processing of video in blocks for the purpose of identifying the activity of a block so as to decide whether to transmit at full accuracy or reduced accuracy. The improvement allows one to always transmit at reduced rates to accommodate the next potential user. The reduced rates do not necessarily affect video quality if redundancy rather than information is eliminated in the transmission selection process. Thus, unlike U.S. Patent 4,862,264, the system allows the decision making process to be based on pixel numerical dynamic range. Video is always transmitted as binary numbers. The number of binary symbols per picture element is fixed by the A/D process, but the number of bits transmitted and their position in the number field is initially determined by the communication processor based on available bandwidth. The communication processor may elect not to transmit video but to devote the bandwidth to audio or data. The user may or may not notice any change in video quality, as the intent is to reduce bandwidth requirements at the source by a controlled field of view and feedback controlled illumination.
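The idea of transmitting a chosen number of bits at a chosen position within each pel's number field can be sketched directly. Here an 8-bit pel is assumed from the A/D process; the position (`msb`) and width (`nbits`) stand in for the values the communication processor would choose from available bandwidth.

```python
def transmit_bits(pel, msb=7, nbits=4):
    """Extract nbits of an 8-bit pel starting at bit position `msb`
    (7 = most significant). Only these bits would be transmitted;
    the rest are treated as redundancy for the current bandwidth."""
    shift = msb - nbits + 1
    return (pel >> shift) & ((1 << nbits) - 1)
```

For example, with the defaults only the top nibble of each pel is sent, halving the video payload at the cost of coarser amplitude resolution.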
  • It should be noted that the communication processor may be used to improve results in areas like that of US Patent 3,795,763. Memory is used to hold successive video frames and perform difference algorithms between them. It then performs further processing, differently from what is described in the 3,795,763 patent. One example of the differences is that pulse sync information is not transmitted, because the workstation video hardware on the other end may have different video characteristics than the transmitting workstation. However, one should recognize that the communication processor processes workstation video, not normal television signals.
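The frame-difference step mentioned above can be sketched in its simplest form: pels whose change between successive frames falls below a threshold are treated as redundant and zeroed, so only genuine changes need be transmitted. Frames are modeled as flat lists of pel values; the threshold is an illustrative parameter.

```python
def frame_difference(prev, curr, threshold=0):
    """Per-pel difference between two successive frames. Changes not
    exceeding the threshold are treated as redundancy (emitted as 0)."""
    return [c - p if abs(c - p) > threshold else 0
            for p, c in zip(prev, curr)]
```

Note that, as the text says, no sync information accompanies the difference data; the receiving workstation reconstructs the frame in its own video format.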
  • After the second communication processor has determined that user #2 is available the processor proceeds.
  • Each workstation can set receive status. If a workstation is busy doing an application and the user has set receive status to "NO" then the communication processor will deny access to the first communication processor with a busy reason. User #1 is also informed. If user #2 has set receive status to "YES" then a query comes through on the screen. If the user responds positively to the query before a predetermined amount of time has elapsed, then a session is initiated. The parameters for workstation #2 are those of workstation #1, so both have the same fidelity and window size. The sessions can also be customized so that user number 2 with a higher performance workstation can have a larger or better quality video window. The customized parameters are set by the user again through menus. The reply to a receive status also includes information about the size of the window defined in a customized list, if available.
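The receive-status handling at the second communication processor reduces to a small decision procedure. The outcome labels below are invented for the sketch; the logic follows the paragraph above.

```python
def handle_request(receive_status, user_accepts, timed_out):
    """Outcome of an incoming session request at the called side.
    receive_status: "YES" or "NO" as set at workstation #2."""
    if receive_status == "NO":
        return "deny-busy"        # first processor and user #1 informed
    if user_accepts and not timed_out:
        return "session-start"    # parameters copied from workstation #1
    return "deny-no-answer"       # query ignored or answered too late
```

A customized parameter list, where one exists, would then override the copied session parameters.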
  • This is a simple case because the second user was at the same facility, although in a different building. In a situation such as this, it is expected that the telecommunications network might not be used, but rather a dedicated optical or wire cable with FDDI type bandwidths. If the second user were not co-located with the first, then the telecommunications network becomes the vehicle for carrying the traffic. In this second situation there are more interactions with more types of machines. The communication processors will most likely communicate through intermediaries. The communication processor must be able to speak the proper language, such as 2B1Q or AMI, at the physical level (reference ISO, CCITT X.200) and up through the network layer.
  • The communication processor can act as an intermediary between two other communication processors wishing to communicate. This is a required function because as yet there is no transmission equipment that can interact with the communication processor for the purpose of providing video conferencing. In a situation where a communication processor acts as network equipment, it may be called on to do routing or bridging between two separate transmission circuits. The communication processor may be required to translate from 2B1Q to AMI, as an example. The rules for the translation are embedded in the artificial intelligence software. Where rules do not exist, the communication processor will not attempt a route or bridge. This area is exceedingly complex and no further explanation is attempted in this document.
  • Cost is a major factor in the decision to implement new technologies. The cost of providing video and audio communications from workstations can be significantly reduced by providing a common communication processor to serve several users. The cost is reduced by using the processing power present in workstations for display and control. The communication processor is not intended to compete against Token Ring or Ethernet Local Area Networks (LANs), but rather performs a different service by providing low cost audio and video communications within the facility area and for long haul transmission. The purpose of the communication processor is to provide the best quality audio and video communications possible given the time variable constraints of the transmission medium and the instantaneous degree of loading or usage. The communication processor does not rely on specific standards for audio or video; rather, the bandwidth, resolution and transmission rate are adjustable to fit the constraints at the time a request for service is made.
  • A workstation initiates a request for service. The user does this by bringing up on the screen a software driven menu. The menu parameters are determined by the communication processor. When the parameters are chosen, they become in aggregate a request for service. A request for service includes data about the nature or type of service and the signal destination. This information is sufficient for the communication processor to attempt several routing threads before an affirmative determination can be made. If an affirmative determination is not possible, then the communication processor will determine what is possible and suggest those possibilities to the user.
  • The main concept of this design is the integration of several disciplines using a system engineering approach. Through the judicious adherences to multiple communication standards and application of advances in high speed signal processing, optics and light wave processing, a low cost communications processor is feasible.
  • The Preferred Embodiment
  • Turning now to our invention in greater detail, it will be seen that FIGURE 1 illustrates our preferred embodiment, in which the communication processor is specified for processing workstation audio and video using artificial intelligence techniques to use the telecommunications network. It is this use of the telecommunications network that will allow, for the first time, a simple effective means for personal video conferencing. The communication processor is a collection of existing technologies, improved and modified to work together toward a common goal. The most important functions are of the network type and are shown in FIGURE 1. The evolution of the network has been towards increasing bandwidth. This has been true for some time; however, there has also been an evolution toward providing digital bandwidth type services in which the customer interfaces the network at various layers. In the structure typically referred to as the ISO or International Standards Organization seven layer model, the lower layers have been specified for basic control of network equipment for access to the physical, data link, network, and transport network functions. The ISO specifications for access to the physical, data link, and network layers are embedded in the communication processor, particularly in the AI, for communicating to those network elements that provide these functions. Beginning with the transport layer and ascending, the communication processor will communicate only with other communication processors to provide this function. Although it may be possible at a later time to structure the transport layer portion of the communication processor to work with telecommunications network equipment at higher layers, for now the lower physical, data link, and network layer functions are embedded in the artificial intelligence portion of the communication processor.
The three lowest layers, using definitions of the CCITT are designed into the communication processor for communicating to telecommunications network equipment. Furthermore the functions assigned to these layers are employed by artificial intelligence software.
  • The communication processor basic components are:
    • 1) Voice Compression/Weighting Subprocessor shared by several users. The algorithm chosen is based on allocatable bandwidth. This sharing is vital because it provides for key bandwidth process control together with the communication processor.
    • 2) Video Compression/Weighting Subprocessor shared by several users. The compression algorithm and weighting process are subject to control from the communication processor based on voice activity and allocatable bandwidth. The subprocessor works in conjunction with component 1, the audio subprocessor.
    • 3) Video Imaging Device (may be embedded in the bezel). Works with the communication processor through the interface boardset for the workstation and associated software.
    • 4) Voice Encoding Processor (may be embedded in the bezel). Provides digital voice capability at either 64,000 bits per second PCM, or 32 or 16 kbps ADPCM or LPC, bidirectionally or full duplex.
    • 5) Statistical Audio/Video Multiplexing Processor shared by several users. This function is driven by AI software in the communication processor. It essentially performs modeling of the communications channel and usage situation. It computes parameters for the channel and aggregate processor. This is the processor that makes use of collected data, such as error free seconds, for changing error correcting codes for the purpose of minimizing overhead.
    • 6) Channel Frame Processor, which may be common or may be shared among users. This is the part of the communication processor responsible for assembling the channel frame for insertion into the aggregate frame. It implements the results of the statistical audio/video multiplexing processor. This function is driven by AI software in the communication processor.
    • 7) Composite Super Frame Processor common to all users, one per port. This function is driven by AI software in the communication processor. This processor implements the language of the telecommunications interface such as FDDI, 2B1Q or AMI or such types.
    • 8) Communication Processor running artificial intelligence (AI) software common to all users, to which all other processors are subservient. This processor is responsible for synthesizing implementation parameters of digital language modulation conversion from one channel in an aggregate frame to reproduce the channel in another aggregate frame. It is also responsible for setting up conditions of facsimile usage.
  • Whereas the products described in Reference items 2 through 5 are independent (not related), this effort undertakes to define a specification between them, not in a unique product sense but rather in a specific functional sense. These products represent what is achievable with current technology. Given what is available, the communication processor can be developed from these or similar kinds of products.
  • Items 6 and 7 are made to relate in a unique way, with the exception that TASI equipment has as its objective increasing the apparent transmission bandwidth by allowing more voice channels to be assigned to a digital carrier than are allowed by fixed multiplexing schemes. For example, the T-1 carrier carries 24 fixed channels of 64,000 bps, but if it were TASI'ed then it could carry as many as 32 or more channels. Service quality is degraded by TASI and data transmissions are seriously impaired. The space or inactivity periods associated with speech are used to transmit extra capacity voice channels. The aggregate frame structure contains the information needed at the distant end to reconstruct the voice channels in proper order. The difference is that the fixed 8 bits used to encode the voice signal are dynamically reduced from 8 down to as low as 4, depending on the dynamic range and activity. When bauded data is substituted for voice, the TASI algorithms cause enough signal distortion noise that the data is destroyed. Hence it is incumbent on users to know how their data circuits are handled by the telecommunications carriers.
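A rough capacity model makes the TASI numbers concrete. Assuming 8,000 samples per second per voice channel, a talker activity fraction of 0.75 (an illustrative assumption; real activity varies), and the sample width reduced from 8 bits, the channel count a T-1 payload supports follows directly:

```python
T1_PAYLOAD_BPS = 24 * 64_000  # 24 fixed DS0 channels = 1.536 Mbps

def tasi_channels(bits_per_sample=8, activity=0.75):
    """Approximate voice channels a T-1 payload carries when silence is
    reassigned (activity = fraction of time a talker actually speaks)
    and samples may be reduced below 8 bits. Illustrative model only."""
    per_channel_bps = 8_000 * bits_per_sample * activity
    return int(T1_PAYLOAD_BPS // per_channel_bps)
```

Under these assumptions the payload carries 32 channels at 8 bits, matching the "32 or more" figure above, and up to 64 when samples are cut to 4 bits.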
  • Circuits bought under a specific tariff may not necessarily be used for other purposes. The communication processor fits into this category. Since AI coding is specific for each tariff circuit, no substitutions can be made. A communication processor port designed for ISDN (2B1Q) B channel cannot be substituted for a 128,000 bps channel from a T-1 programmable multiplexer, even though the nominal rates are the same.
    First Element of Distinction, Network Architecture
  • Several workstations are connected to the communication processor for the purpose of providing video conferencing. The video conferencing may be local in origination and destination, or remote in destination. Remote destinations require a communication processor and workstation set up similar, but not identical, to the originating units. Workstations on the same communication processor can video conference as well. The workstation runs software to perform several tasks in conjunction with the communication processor. These five tasks are:
    • 1) Prompt the user for the type of service to be attempted through the use of menus. The information to be gathered is of two kinds:
      • a) General class of video service desired such as window size, and low to high refresh rates.
      • b) General class of audio service desired such as low to high audio clarity and none to one second delay.

      Associated with the window size is a high fidelity symbol. The high fidelity symbol is an icon of a gramophone. The high fidelity symbol represents voice quality, which is the trade-off against window size. As the user varies the size of the window from small to large, the high fidelity symbol varies from large to small. The video window is composed of two parts: one part is size and the other part is speed. The speed at which the video is updated is represented by rotating wheels (motion picture camera reels). The video symbol is a motion picture camera with reels on top of a box, and a lens to one side of the box. To the side of the camera is a box, variable in size, in which the receive video will be located. The communication processor controls the speed of the wheels by telling the workstation the high range for window size and speed that can be allowed at this time. The workstation, through the applications software, works the symbols and proportions them from a low value up to the maximum. The faster the wheels spin, the more natural the video; the slower the spin, the less natural. Experience will guide the user to an acceptable value. If sufficient bandwidth is available there may be little or no interaction between the high fidelity symbol, reels and window size. If bandwidth is small, a small movement will cause a very large interaction quickly. The communication processor controls the size of the symbols at all times and the user cannot request or cause an impossible service. An impossible service would be all video and no voice, or the reverse, or any situation not programmed into the artificial intelligence software.
      Audio service also requires some attention. In this trade, audio fidelity is traded against video quality. Inherent in the trade, and in the speed of the reels, is the synchronization of voice with video. The communication processor will always attempt to sync voice with video, and will permit at most one second of difference between voice and video synchronization.
    • 2) Prompt the user for the telephone number associated with the remote end workstation. Video conference routing uses telephone numbers along with specific names, passwords, and time based call acceptance criteria. The communication processors communicate this information among themselves in the super frame in order to establish connection instruction tables. The telephone numbers are important because there is a large investment in the telephone number data base. The communication processors are programmed with the identity of each user service. This information is actively shared among communication processors. Part of this information is the user's receive status. The communication processor will not attempt a call if the receive status is set negative. Since the communication processor network is an important part of the user directory, the "Who is Who" directory first starts out with a geographic map showing locations of user concentrations. A software controlled hook onto one of those locations will bring up the names and telephone numbers of users at that point. There will also be a receive status indicator to keep users from wasting effort. This information is shared among communication processors using idle Super Frame aggregate capacity.
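    The inverse proportioning of the window and high fidelity symbols described under task 1 might be sketched as follows. This is an illustration only: the linear trade and the fixed budget are assumptions made for the sketch, and the actual proportioning is governed by the communication processor's artificial intelligence software.

```python
def proportion_symbols(window_fraction, budget=1.0):
    """Given the requested window size as a fraction (0..1) of the allowed
    maximum, return (window share, fidelity share) of a fixed budget."""
    window_fraction = min(max(window_fraction, 0.0), 1.0)
    video_share = window_fraction * budget
    audio_share = budget - video_share  # fidelity symbol shrinks as window grows
    return video_share, audio_share

# A large window leaves a small high fidelity symbol, and vice versa.
print(proportion_symbols(0.75))
print(proportion_symbols(0.25))
```

    Under tight bandwidth the budget would shrink, so that a small window movement produces the large, quick interaction described above.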
    Second Element of Distinction, Workstation Video Specifications
  • An approach, with potential for low cost, is to split the camera function of optics and electronics for the purpose of locating the camera electronics at the Communication Processor. A basic camera to convert image to signal contains two essential components: focusing optics and image conversion electronics. Such a camera may be implemented by using focusing optics in conjunction with a fiber optic focal plane image collector located at the Communication Processor. The purpose of this approach is to use one set of high speed electronics to service multiple users. If the imaging device is fast, then it can service several users simultaneously. At the Communication Processor, the weighting algorithm and multiplexing algorithm can be applied simultaneously. FIGURE 2, entitled Workstation Video Input, shows the interface to the video collection circuits located at the workstation location. The imaging lens and video collection system is something on the order of a videoendoscope or a modification thereof. To keep the depth small, the primary lens can be designed to interface to a right angle mirror. The mirror will allow the required focal length to be obtained along the width of the bezel. This way the bezel depth can be made small. The imaging device, or electronic focal plane, is then at right angles to the image. There is also the possibility that the electronic focal plane need not be present in the bezel, but located in the communication processor. The imaging lens and collection system then transmits the image in optical form over an optical guide to the communication processor. Both techniques will work, but when implementation cost is considered, the electronic focal plane in the bezel would seem to be the preferred approach; the second approach is further developed in the alternative approach section.
  • Returning to the preferred approach, consider the possibility of a conventional approach using a miniature camera device, such as that in item 3, in a bezel located at the workstation display. In this case a single wireline cable is all that is required for the interface between the workstation and the Communication Processor. The bandwidth of an average RF type cable is sufficient to support frequency division multiplexing of send and receive video, audio, and data. FIGURE 10 shows that when this approach is used, the interface to the communication processor is simplified, considering the other types of signals that share the cable.
  • One key to the invention is the video processor weighting algorithm required to preserve picture quality and conserve transmission bandwidth. Compression is required in addition to weighting. Compression and weighting are not mutually exclusive, but must be designed together for compatibility. Consider the following: the image is divided into scan zones, with a unique central zone defined as the area where the eye spends more than 50% of the time focused, and a unique concentric peripheral zone defined as the area where the pupil spends 35% of the time focused. The remaining concentric peripheral zone is the remainder, 15%. FIGURE 11 is a graphical representation of an applicable weighting scheme.
  • The image frame, which consists of N x J vertical and horizontal scan lines, is digitized into M bits of resolution. For the sake of example, M is represented by the numerical value 12, and N = J = 512 picture elements. Then each picture element is represented by 12 bits of resolution. The total number of bits is (N * J * M) or 3,145,728 bits. At this rate, a serial channel running at the rate of 1.544 Mbps would require just over 2 seconds to transmit. If compression is able to reduce this value by 15 times, then the value is 209,715. The first video frame is transmitted as compressed only, as there is no prior frame to compare to. The next frame is weighted against the first by subtracting it from the previous frame, in a picture element by picture element comparison. If the resultant value is less than some predefined delta amount, the resultant is zero for that picture element. The position of the first picture element not equal to zero is noted as (x,y) of (N,J). The position of the next picture element at location (x±1, y+1) should also be non zero. Sigma is the deviation computed from the next r scan lines of non zero picture element comparisons, where r times s is about 50% of the central portion of the picture, and at most 50% of the total number of picture elements, or less if so computed. Therefore (x,y) to (x+s,y+r) is the central portion of the picture. To compute a starting point for transmission of 12 bit numbers, x is used for the starting row value and y is chosen for the column value. The last value in this sequence is x+s and y+r. The total number of bits is then (256 * 256 * 12) or 786,432 bits maximum, or less if comparisons produce zero values before 50% of the central portion of the picture occurs first. If compression is applied next, then the maximum number of bits is reduced by a factor of 15 (average compression possible) to 52,428. Since DS0 is picked as the base carrier rate, about 11,570 Bps is available for voice and overhead.
FIGURE 11 shows a possible weighting algorithm.
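The frame-size arithmetic above reduces to a few lines. This is an illustrative calculation only, using the example values from the text (a 512 x 512 frame at 12 bits, a 1.544 Mbps serial channel, and an assumed average 15:1 compression ratio), and is not part of the claimed apparatus.

```python
N = J = 512           # pels per side (example values from the text)
M = 12                # bits of resolution per pel
DS1_RATE = 1_544_000  # serial channel rate, bits per second

total_bits = N * J * M                  # raw bits per frame: 3,145,728
seconds_at_ds1 = total_bits / DS1_RATE  # just over 2 seconds to transmit
compressed_bits = total_bits // 15      # after the assumed 15:1 compression

# Central 256 x 256 region retransmitted at full 12-bit resolution:
central_bits = 256 * 256 * M            # 786,432 bits maximum
central_compressed = central_bits // 15

print(total_bits, round(seconds_at_ds1, 2), compressed_bits, central_compressed)
```

With the central region compressed to roughly 52,428 bits, a DS0 carrier at 64,000 Bps leaves the stated margin on the order of 11,570 Bps for voice and overhead.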
  • Once in digital format, video weighting is applied. In this algorithm the assumption is that the video is 640 pels by 480 pels, but it could be any number. The combined number of pels is 307,200. Of this number, by arbitrary definition, 50% are given 12 bits of resolution, 35% are given 8 bits of resolution, and 15% are given 4 bits of resolution. The values 12, 8 and 4 were chosen to exaggerate the weighting description for purposes of providing a clear explanation of the concept; there could be several such values used by the communication processor. The picture is transmitted in gray scale, with the first portion showing 4096 shades, and 256 and 16 shades for the remaining portions. Each area is compressed using an image compression algorithm with boundary dependency.
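  The three-zone bit budget just described can be checked with a short calculation; the 640 x 480 frame and the 50/35/15 split at 12/8/4 bits are the example values from the text, not fixed parameters of the processor.

```python
WIDTH, HEIGHT = 640, 480
total_pels = WIDTH * HEIGHT                 # 307,200 pels

# (fraction of pels, bits per pel): central, peripheral, remainder zones
zones = [(0.50, 12), (0.35, 8), (0.15, 4)]
weighted_bits = sum(round(total_pels * frac) * bits for frac, bits in zones)
uniform_bits = total_pels * 12              # every pel at full resolution

# Weighting alone trims roughly 22% off the uniform 12-bit budget,
# before any image compression is applied to each zone.
print(total_pels, weighted_bits, uniform_bits)
```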
  • Compression factors of 50 to 100 to one are possible in real time, but are not required in real time for the communications processor. Video may be presented as a series of images updated at some rate proportional to the request, bandwidth cost, and system loading condition of service. Of keen importance to user acceptance is the video signal to noise ratio. To enhance the signal to noise at the source, a technique is shown using a low power laser in the short to medium short IR wavelength region to illuminate the subject.
  • Illumination using fresnel lens dispersion is a good way to maintain a high photometry value while maintaining a low radiometry level. It is recognized that subject eye safety can be assured if the radiometry can be kept low by virtue of the fresnel dispersion lens. At the same time, the detector frequency response should be as wide as possible, from visible to medium short wavelength IR. FIGURE 6 shows the detector frequency integration scale illustrating the effect of integrating all the wavelengths shown. In practice, imaging systems do not try to control illumination except in the broad sense of having enough light to make a good image. The spectral content and short term behavior play an important part in image processing. The goal of image illumination is to eliminate illumination vagaries, such as power line AC characteristics and fluorescent lighting blue/green spectral content, that would tend to create more image processing work for the communication processor. Office illumination depends on AC power and fluorescent type lights for the primary source.
  • The integration of the amplitude over the frequency bandwidth will result in a signal with a high signal to noise value. This signal is reproduced at the distant end as a black and white gray scale video. Strictly speaking white is not a monochromatic color, but in this case can be considered such because it is a single value (area under the curve of FIGURE 12) function of amplitude only. This approach was simplified for space and time reasons. It is recognized that the final signal is a convolution of the source illumination function, fiber optic band pass response, and the detector bandwidth.
  • Third Element of Distinction, Audio Management Specification
  • The Communication Processor may be connected to the workstation over high quality coaxial cable. The cable is split into several frequency management units using frequency division multiplexing. This will permit the cable to be loaded with duplex voice, receive simplex video with feedback, duplex data, and duplex network control data. FIGURE 14 shows how voice is processed at the DS0 level.
  • FIGURE 7A shows the general configuration of an overall network of units of the system for the audio video communication processor shown in FIGURE 7B. Generally, it will be seen that the system provides one or more communications processors to service a group of workstations with audio and video transmission processing for the purpose of providing video conferencing.
  • The communication processor utilizes artificial intelligence software to read the connection conversion rules contained in tables so that the system can react to the communications environment. The system is coupled for processing optical signals for low cost communication and video conferencing with audio and video communications within the facility area and for long haul transmission. The communication processor provides audio and video communications under instantaneous constraints of the transmission medium and the instantaneous degree of loading or usage. Bandwidth, resolution and transmission rate are adjustable to fit the constraints at the time a request for service is made. A workstation initiates a request for service. A request for service includes data about the nature or type of service and the signal destination. This information is sufficient for the communication processor to make several attempts at threading before an affirmative determination can be made. If an affirmative determination is not possible, then the communication processor will determine what is possible and provide an output to the user for possible changes in the request. With respect to FIGURE 7B it will be seen that the audio video communication processor has network ports, a local port, and the communication processor unit of the system which is provided by the additional units connected to the system digital bus. Connected to the bus are a composite super frame processor, a statistical audio/video multiplexing processor, a channel frame processor and the workstation subprocessors, which include the video processor with weighting and compression, the audio processor with compression and weighting, and the workstation interface circuits. These workstation subprocessors have both analog and digital interconnections. The system has a telephone type interface port for connection of workstations as part of the overall system.
  • The telephone type interface is a unique concept in that it allows stand alone devices such as facsimile to interface with workstations. Workstation can run fax emulation software by which documents in electronic form are converted to a fax signal and sent to distant facsimile equipment. The telephone interface is used to support packetized audio on the network. The telephone type interface can accept telephone signaling information and originate signaling information.
  • Workstation voice signals are accepted by the communications processor. One of N voice input lines is accepted by multiplexing for processing. Sampling is performed at 64 Kbps to be compatible with existing practice (first level DS0 carrier). The weighting can be either mu law or A law. The Communication Processor automatically performs conversion between mu law and A law when the routing table description indicates conversion is necessary, as in the case of transatlantic video conferencing. After weighting, the signal is processed further to lower the transmission rate and assemble the resultant data into a numerical sequence suitable for a packetized network. The rate at which the packets are transmitted is dependent on network loading. The Communication Processor network is not intended to compete against the telephone network, but rather to add voice to video; the option to select voice only is included, however.
  • A note about telephone type voice bandwidth. The rate of 64,000 Bps is the result of a Nyquist sampling rate of 8,000 samples per second. Each sample is 8 bits of binary data. Two to the eighth power is the decimal value 256. Since the electrical voice signal should be symmetrical around the zero voltage axis, 256 is partitioned into +128, -127 and 0. The sum of these position holding numbers is 256. This means that a full 4,000 hertz of signal bandwidth is available for voice. The type 500 handset commonly used in a telephone apparatus has a nominal bandwidth of 3,000 hertz. The first limiting filter is then the handset. A handset or headset that is part of the communication processor group will not be so limited; it will have a nominal 4,000 hertz high fidelity voice bandwidth. FIGURE 14 shows how a packet for audio would be formulated for video conferencing. The process labeled Linear Predictive Code Compression (LPC) can reduce the voice data bandwidth to as low as a 1,200 Bps serial rate. This voice has poor quality concerning elements such as speaker recognition, and lost words cause speech to be repeated when the transmission medium performs poorly. Since the medium is expected to have high quality, the only degradation in 1,200 Bps speech is from the algorithm. Voice will be processed at 2,400 Bps LPC and 16,000 or 32,000 Bps Adaptive Differential Pulse Code Modulation (ADPCM) as well as 64,000 Bps PCM. The choice depends on the available bandwidth, the user request, and compatibility at the distant end. These factors are kept track of by the artificial intelligence software running on the communication processor. FIGURE 4 shows how variable partitioning is used. Variable partitioning reduces redundant processing and saves bandwidth space.
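  The sampling arithmetic above reduces to a few lines; the rate table at the end simply restates the codec rates named in the text and is shown for illustration only.

```python
SAMPLE_RATE = 8_000   # Nyquist sampling, samples per second
BITS_PER_SAMPLE = 8   # 2**8 = 256 quantization levels (+128, -127 and 0)

ds0_rate = SAMPLE_RATE * BITS_PER_SAMPLE  # 64,000 Bps PCM, the DS0 rate
nyquist_bandwidth = SAMPLE_RATE // 2      # 4,000 hertz of voice bandwidth

# Coding rates named in the text, from highest to lowest quality:
rates_bps = {"PCM": [64_000], "ADPCM": [32_000, 16_000], "LPC": [2_400, 1_200]}

print(ds0_rate, nyquist_bandwidth)
```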
  • The audio can be replaced by a fax signal when facsimile is requested. The bezel, or workstation adapter, has an input for fax signals. The communication processor does not generate fax signals, but one side of the workstation, not related to the communication processor interface adapter, may generate them. Dynamic allocation of bandwidth is based on audio as the first priority when the signal is fax, given the network loading. Periods of audio inactivity are used to transmit video. Audio is dynamically compressed from the 64,000 Bps using a good predictive compression algorithm. Voice data rate expectations are 2,400 to 1,200 Bps. Audio compression may also be zero, so the rate would be DS0, or 64,000 Bps, as would be the case for fax signals. The workstation packet control section is likewise dynamic. If the information is redundant, previous packet repeat semaphores are used to convey this information.
  • The Communication Processor is required to collect all the fax packets before sending them on to the workstation. Fax packets are not permitted to be discarded, as is possible with audio or video packets. A buffer may be required for this function. Voice processing requires a buffer also. The video signal does not require a buffer; a video packet may be discarded if required. The workstation will not assemble a new frame unless the frame is complete and intact. It is recognized that there is a need to enhance the hardware for video images among a plurality of teleconferencing stations over telephone lines, something that U.S. Patent 4,682,225 did not address.
  • The communication processor with the video processor discards picture elements or pels. They are discarded because the receiving station has requested a window size for the video. Pels that do not have an address in the window as determined by an addressing algorithm are discarded. That is the first step in reducing unnecessary bandwidth. All pels that have an address in the distant end window will be processed by compression or weighting. No single representative pel is used to represent a group of pels.
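  The pel-discarding step just described might be sketched as follows. The rectangular window test is an assumed form of the addressing algorithm, shown for illustration only; the patent does not specify the algorithm's internals.

```python
def in_window(x, y, window):
    """window = (x0, y0, width, height) requested by the receiving station."""
    x0, y0, w, h = window
    return x0 <= x < x0 + w and y0 <= y < y0 + h

def filter_pels(pels, window):
    """Discard pels whose address falls outside the requested window;
    surviving pels go on to compression or weighting."""
    return [(x, y, v) for (x, y, v) in pels if in_window(x, y, window)]

frame = [(0, 0, 10), (100, 50, 20), (639, 479, 30)]  # (x, y, value) triples
print(filter_pels(frame, (64, 32, 128, 128)))  # only (100, 50, 20) survives
```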
  • Fourth Element of Distinction, Network Management Specifications
  • The key to the Communication Processor concept is to embed network standards compatibility between various transmission systems (standards bridge). The T carrier system is common and tariffs are available for this service. T carriers may be used in ISDN networks. The Synchronous Optical Network (SONET) uses a different multiplex scheme which could also carry ISDN signals. The lowest level is OC-1 at 51.84 Mbps. This level may be used for the local area transmission access (LATA) loop or as a feed to long haul networks. It will provide a great deal of capacity for high quality communications and is the preferred carrier. The Communication Processor must be compatible with signalling techniques such as DS2/DS3. The Communication Processor will not interface to a telephone switch at any level, but wide area network transmission equipment may look for the signaling pattern and could substitute network routing information momentarily. The artificial intelligence software, by virtue of knowledge of tariffs, will know when a carrier is capable of this and thus not permit crucial control information to be allocated to those bit positions that are most susceptible. This feature is also true of the ISDN Bearer channel.
  • For local loop operations, those co-located at a facility, an FDDI carrier is the preferred choice. Reliance on a telecommunications network carrier would not be required for this choice. FIGURE 16 shows how the local loop and the telecommunications network differ. It is also possible for a regional carrier to offer a 64,000 Bps carrier from the telephone system that is not a telephone circuit. Such a circuit would most likely be inserted and dropped by a DACS machine prior to the T-1 carrier connecting to the telephone switches.
  • Consider the connectivity case of FIGURE 16. The instantaneous constraints of CP2 to CP4 are dramatically different from those of CP3 to CP5. The services available from CP2 to CP4 are correspondingly reduced. CP2 may be using a dedicated DS0 circuit. Another possibility is an ISDN interface. Each has its own characteristics, throughputs and protocol. The Communication Processor is capable of interfacing each on the network aggregate side. The key to this is the artificial intelligence software, which keeps track of which tariff offering is connected to which network port, with all its capabilities and limitations.
  • The Communication Processor must also maintain circuit information, a history file of network circuit attributes such as bit error rate and error free seconds, in order to select an error correcting code that will result in the least transmission overhead. The code that is selected is sent in the packet control portion of the frame. Expert rules are made to take into account error statistics. If the bit error rate is 10^-5 or better, and/or the error free seconds value is 10 or better, then ECC may not be used. Note that the distribution of errors has more impact than the magnitude of BER in selecting a suitable ECC. Refer to FIGURE 17. The Communication Processor is opportunistic; in an analogy similar to meteor burst communications, the Communication Processor can take advantage of good propagation time to raise the transmission rate by reducing the overhead. FIGURE 17 is an example of error data statistics which could be collected by the communication processor. FIGURE 17 shows how the size of a block can affect the quality of transmission. Automatic repeat request is a parameter that can be measured. Also, a given protocol can keep track of the number of times it must intervene to replace an error bit. The communication processor can keep track of these statistics and make changes to the protocol selection for maximum data bandwidth and reasonably good quality.
  • Expert communications processing (AI) will require consideration of error statistics. A key factor of throughput performance is connection quality. Bit error rate alone is insufficient to select a suitable error correcting code (ECC). Error free seconds (EFS) and error distribution over packet size are required in addition to BER. It is important that the Communication Processor measure these values and keep a history of them per network port. Selection of block size would also be decided in conjunction with ECC.
  • Consider the following example for illustration. A derivative key factor to compute is the throughput per block size, which must include factoring in the dynamic block growth from adding in ECC. In FIGURE 17, for example, the block size grows geometrically to add in a candidate ECC (other ECCs may grow arithmetically, or linearly). In FIGURE 17 the history of a network port is shown. The statistics are such that, on a 50% average, a 116k size block contains one sequence of 7 continuous erred bits. That means one of two blocks is corrupted. If an ECC has performance such that 6 erred bits can be rectified in any sequence, including a continuous sequence, then that ECC would be chosen, along with a smaller 48k block size, as suitable for present conditions. ECC is only applied against control information. Audio or video data do not receive ECC. However, the statistics of error for the control information apply equally well to voice and video data when computing block size. When the situation is such that ECC is no longer effective, then dynamic evaluation of the connection is required such that the call may be terminated prior to normal service completion on account of unsatisfactory performance. A call may be attempted several times per hour with the intention of collecting statistical information when bandwidth space is available. These call attempts are not necessarily initiated by a user, but may be based on prior usage patterns. The statistics of the attempts are stored and used in calculating the success of ECC types as a probability of success for future attempts.
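  A much-simplified sketch of such an expert rule appears below. The candidate code names, correctable burst lengths, block sizes and thresholds are illustrative assumptions, not the patent's actual rule base; the real selection would weigh the full history kept per network port.

```python
def choose_ecc(ber, error_free_seconds, max_burst_bits):
    """Pick (error correcting code, block size) from port history statistics."""
    # Good connections (BER of 1e-5 or better and 10+ error free seconds):
    # skip ECC entirely and keep the large block for least overhead.
    if ber <= 1e-5 and error_free_seconds >= 10:
        return None, 116_000
    # Otherwise choose the smallest hypothetical candidate code that can
    # rectify the observed burst, and drop to a smaller block size under ECC.
    for name, correctable in [("ECC-3", 3), ("ECC-6", 6), ("ECC-12", 12)]:
        if correctable >= max_burst_bits:
            return name, 48_000
    return "retransmit-only", 16_000  # burst too long for any candidate code

print(choose_ecc(1e-6, 30, 0))  # clean port: no ECC, 116k blocks
print(choose_ecc(1e-4, 2, 7))   # 7-bit bursts: ECC-12, 48k blocks
```

  In this sketch, ECC would apply only to control information, as the text specifies, while the block size chosen governs voice and video data as well.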
  • A summary of the network management plan shows the key plan elements.
    • 1) Descriptor table for each network port side interface, all parameters and characteristics of the tariff service.
    • 2) Jobs in service table which contains all the parameters effecting service.
    • 3) Request for service table (proposed in service channels), queued by time of arrival, bandwidth request and complexity of connection.
    • 4) Quality performance table (contains measured performance statistics and computed statistics) for each network port.
    • 5) Computation of required services, and customized requests for each user in a performance table that will effect each request in a table (for adding new channels).
    • 6) Computation of limits of performance table of aggregate frame per network interface port. (System stops)
    • 7) Model of each of the current aggregate ports along with data collected for the purpose of projecting bandwidth requirements and for computing bandwidth allocations. Develops all the primary parameters required by the subprocessors.
    Other Features
  • It is a feature of the invention that the communication processor shares the video subprocessor in the communication processor with several users. The method of sharing is not important; for example, analog switching into one A/D, or a discrete A/D per workstation user connection. However, the processing of video using weighting or compression for several users is performed by one communication processor with the subordinate help of a video processor for all users.
  • The communication processor uses a video subprocessor to perform the work of selecting the picture elements from each user according to the specifications determined by the communication processor. The video subprocessor always strives for maximum bandwidth reduction so as to leave transmission capacity available for the next potential user. The video is either compressed or weighted depending on conditions. Weighted means that the allocated dynamic range of the picture element is reduced to an active dynamic range, which in practice will be fewer bits than the full allocated dynamic range. For example, if 12 bit positions represent the full range and the pel exhibits range fluctuations of 4 bit positions, then only the 4 bits are transmitted. A special protocol is used to tell the receiving side which 4 out of the allocated 12 are being transmitted. Each pel also has a specific address in the workstation video memory plane. Therefore no information is required to be transmitted for vertical or horizontal synchronization. The communication processor has little need to worry about compression or the techniques for such compression. Compression can be achieved by hardware or by software, and the kind of compression is not important to the invention.
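  The active dynamic range weighting might be sketched as follows. The function names and the (bit count, payload) encoding are assumptions made for illustration, standing in for the special protocol mentioned above that tells the receiving side which bits are transmitted.

```python
def weight_pel(value, range_low, range_high):
    """Return (active_bits, payload): only the bits the pel actually spans,
    instead of the full 12-bit allocated dynamic range."""
    span = range_high - range_low
    active_bits = max(1, span.bit_length())  # e.g. a span of 15 needs 4 bits
    payload = value - range_low              # fits in active_bits bits
    return active_bits, payload

def unweight_pel(range_low, payload):
    """Receiving side restores the pel from the protocol's range information."""
    return range_low + payload

# A 12-bit pel fluctuating over only 4 bit positions: send 4 bits, not 12.
bits, payload = weight_pel(2057, 2048, 2063)
print(bits, payload, unweight_pel(2048, payload))
```

  Because each pel carries a fixed address in the workstation video memory plane, no synchronization information needs to accompany the payload.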
  • It should be noted also that, unlike U.S. Patent 4,733,299, the system has traffic types that are the same. The communication processor is not concerned with progressive or interlaced scanned video images, but rather with video data. Some workstations may be using interlaced video and others progressive video. All the users of the communication processor are performing video conferencing regardless of scan type. The basic video conferencing structure of the combined side frame is the same for all users, with the exception that some users may have more bandwidth allocated to them than others. There are several reasons for this, but when a connection is made, the bandwidth is allocated for the duration of the session. There may be momentary variations in the bandwidth of either the audio or video, but they will not be a factor to the user. Each user has a channel in a combined side frame which can only be decoded by another communication processor. The communication processor will ideally use high speed digital carriers like T1 and other type carriers on the combined side, but they cannot be processed as DS1, DS2, or DS3 carriers by transmission equipment such as a Digital Access Cross Connect to connect to voice or data switches. They cannot be processed as variable fractional T1 service either. If the combined side service is fractional T1, then it must be a fixed fractional service.
  • Consider the possibility that a series of lenses is used to focus the image onto the end of a fiber optic cable connected to the Communication Processor. The quality of the fiber is distinct from those that carry high speed binary data. The fiber is required to maintain the focal plane rectilinear spatial image without precise regard to time dispersion along the Z-axis. There are in existence fiber lenses that exhibit a large focal plane and then gradually taper the lens diameter into the cable diameter. The cable terminates at the Communication Processor into a lens array and electronic shutter. The shutter is positioned to project the image onto a facet of a multi-plane input, single plane output lens (complex prism). The output of the complex prism is a focusing lens array into the focal plane detector. FIGURE 11 shows weighting.
  • The Video Controller selects the electronic shutter in multiplex sequence and collects the image from the Focal Plane Detector (CCD) in analog form and converts the signal to digital format. CCD high speed performance makes possible the cost advantage of servicing several users asynchronously, but isn't necessary to Communication Processor performance.

Claims (14)

  1. An audio video communication processor system for coupling workstation units in a system which transmits audio and video information over a carrier, comprising:
       a communication processor having a digital bus for intercoupling elements coupled to the communication processor;
       said communication processor having a plurality of network interface ports including a port for a network carrier signal and a port for a local loop carrier signal,
       said communication processor having means for transmitting information carried by the port carrier signals from one network port to another network port and from a coupled workstation to a network port,
       a workstation interface, a video processor and an audio processor for processing video and audio information at a workstation level, said workstation interface, said video processor and said audio processor being interconnected to pass digital and analog signals therebetween and for passing digital information via said digital bus to said communications processor;
       a channel frame processor connected to said digital bus for controlling communication over said digital bus,
       a statistical audio/video multiplexing processor connected to said digital bus for dynamically allocating bandwidth between audio and video information signals on the digital bus.
  2. An audio video communication processor system according to claim 1 wherein the audio processor has a voice compression/weighting subprocessor shared by several users coupled to the communication processor system.
  3. An audio video communication processor system according to claim 1 or 2, wherein the audio processor has means for performing compression and weighting based on an allocatable bandwidth provided by a user of the system and based upon a decision made by the communication processor to allocate final bandwidth process control for the requested bandwidth.
  4. An audio video communication processor system according to claim 1, 2 or 3, wherein the video processor has a video compression/weighting subprocessor shared by several users coupled to the communication processor system.
  5. An audio video communication processor system according to claim 1, 2, 3 or 4, wherein the video processor has means for compressing video information, subject to control from the communication processor based on video activity, and allocatable bandwidth and weighting provided by a user of the system, and based upon a decision made by the communication processor to allocate final bandwidth process control for the requested bandwidth.
   6. An audio video communication processor system according to any of claims 1 to 5, further comprising an image capture camera for outputting digital video to the communication processor through the workstation interface and/or
    further comprising a voice encoding processor for providing digital voice capability and/or
    further comprising a statistical audio/video multiplexing processor shared by several users of the system for performing modelling of the communications channel and usage situation and computing parameters for channel transmission in the system.
   7. An audio video communication processor system according to any of claims 1 to 6, wherein said channel frame processor may be coupled to be shared among users for assembling a channel frame for insertion into an aggregate frame.
   8. An audio video communication processor system according to any of claims 1 to 7, further comprising a composite super frame processor common to all users, one per port, for implementing a language of the telecommunications interface.
   9. An audio video communication processor system according to any of claims 1 to 8, wherein said communication processor has means for executing artificial intelligence (AI) software, common to all users, to which all other system processor elements are subservient, for synthesizing implementation parameters of digital language modulation conversion from one channel in an aggregate frame to reproduce the channel in another aggregate frame, for setting up conditions of facsimile usage, and for threading channels to establish connections between system elements for connection of workstations operating with said system.
  10. An audio video communication processor system for coupling workstation units in a network system which transmits and receives audio and video information over a carrier, comprising:
       a communication processor having a digital bus for intercoupling elements coupled to the communication processor;
        means for receiving different network carrier signals,
       means for transmitting and receiving audio and video information over the network system,
        said means for transmitting and receiving audio and video information including means for translating from one network carrier signal type to another network carrier signal type, and
        means for coordinating video frame information with audio and control information to interrelate the audio and video information transmittal under request of a user and under control of the communication processor.
  11. An audio video communication processor system for coupling workstation units in a network system which transmits and receives audio and video information over a carrier, comprising:
       a communication processor having a digital bus for intercoupling elements coupled to the communication processor;
        means for receiving different network carrier signals,
       means for transmitting and receiving audio and video information over the network system,
        said means for transmitting and receiving audio and video information including means for translating from one network carrier signal type to another network carrier signal type, and
        means for coordinating video frame information with audio and control information to interrelate the audio and video information transmittal under request of a user and under control of the communication processor to intercouple the different elements of the system for common interconnected usage.
  12. An audio video communication processor system according to claim 11, further comprising: a workstation console audio video interface and/or
    further comprising: a video weighting subprocessor and/or
    further comprising: a weighting and compression subprocessor for audio and video signals and/or
    further comprising: a weighting and compression subprocessor for audio and video signals coupled to the communication processor system for frame encoding and for allocating audio with the video frame signal and/or
     further comprising: means for allocating information relating to audio and video signals to a channel location as a super frame.
   13. An audio video communication processor system according to claim 11 or 12, wherein the communication processor has a feedback control loop to an illumination device of a display device for the workstation for controlling signal to noise at the source.
  14. An audio video communication processor system according to claim 11, 12 or 13, wherein the communication processor has a look up table for providing the knowledge necessary for the communication processor to decide the parameters under which it will accept and implement a user request for service.
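The dynamic audio/video bandwidth allocation recited in claims 1 and 5 can be illustrated with a minimal sketch: a fixed channel budget is split between the audio and video streams according to measured activity and a user-supplied weighting. All function names, parameters, and numeric values here are hypothetical illustrations, not taken from the patent.

```python
def allocate_bandwidth(total_bps, audio_activity, video_activity,
                       video_weight=0.5):
    """Split total_bps between audio and video streams.

    audio_activity / video_activity lie in [0, 1] (e.g. voice energy,
    motion detected per frame); video_weight in [0, 1] is the user's
    preference for video over audio.
    """
    # Weighted demand for each stream; a small floor keeps an idle
    # stream from being starved completely.
    audio_demand = max(audio_activity * (1.0 - video_weight), 0.05)
    video_demand = max(video_activity * video_weight, 0.05)
    total_demand = audio_demand + video_demand
    audio_bps = int(total_bps * audio_demand / total_demand)
    return audio_bps, total_bps - audio_bps

# Active speech with little motion: most of the channel goes to audio.
a, v = allocate_bandwidth(64_000, audio_activity=0.9, video_activity=0.1)
assert a > v and a + v == 64_000
```

The floor on each demand term reflects the claims' requirement that both streams remain present in the channel frame even when one is momentarily idle.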
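Claim 14's lookup table, through which the communication processor decides the parameters under which it will accept a user request for service, can be sketched as a simple admission check. The table contents and service names below are illustrative assumptions only; the patent does not specify them.

```python
# Hypothetical service table: service -> (min_bps, max_bps).
SERVICE_TABLE = {
    "voice":      (8_000, 64_000),
    "videophone": (64_000, 384_000),
    "conference": (128_000, 1_536_000),
}

def admit_request(service, requested_bps):
    """Return the granted bit rate, or None if the request is refused.

    A request below the service minimum is refused outright; a request
    above the maximum is clipped to the table's allocatable ceiling.
    """
    entry = SERVICE_TABLE.get(service)
    if entry is None:
        return None
    min_bps, max_bps = entry
    if requested_bps < min_bps:
        return None
    return min(requested_bps, max_bps)
```

Clipping rather than refusing an oversized request mirrors the claims' notion that the processor, not the user, allocates final bandwidth process control for the requested bandwidth.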
EP19930111135 1992-07-29 1993-07-12 Audio/video communications processor Expired - Lifetime EP0581101B1 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
US07921536 US5392223A (en) 1992-07-29 1992-07-29 Audio/video communications processor
US921536 1992-07-29

Publications (2)

Publication Number Publication Date
EP0581101A1 (en) 1994-02-02
EP0581101B1 EP0581101B1 (en) 1999-02-03

Family

ID=25445583

Family Applications (1)

Application Number Title Priority Date Filing Date
EP19930111135 Expired - Lifetime EP0581101B1 (en) 1992-07-29 1993-07-12 Audio/video communications processor

Country Status (5)

Country Link
US (1) US5392223A (en)
EP (1) EP0581101B1 (en)
JP (1) JP3061981B2 (en)
CA (1) CA2096160C (en)
DE (2) DE69323357T2 (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO1998056177A1 (en) * 1997-06-02 1998-12-10 Northern Telecom Limited Dynamic selection of media streams for display
FR2794326A1 (en) * 1999-05-31 2000-12-01 Canon Europa Nv Method of updating processing peripherals on communication network

Families Citing this family (182)

Publication number Priority date Publication date Assignee Title
US5495284A (en) 1993-03-12 1996-02-27 Katz; Ronald A. Scheduling and processing system for telephone video communication
US7019770B1 (en) * 1993-03-12 2006-03-28 Telebuyer, Llc Videophone system for scrutiny monitoring with computer control
US6323894B1 (en) * 1993-03-12 2001-11-27 Telebuyer, Llc Commercial product routing system with video vending capability
US20030185356A1 (en) * 1993-03-12 2003-10-02 Telebuyer, Llc Commercial product telephonic routing system with mobile wireless and video vending capability
US5717857A (en) * 1993-03-19 1998-02-10 Ncr Corporation System for switching data connection to use first channel and second channel without substantial interruption of transfer of audio signals and image data between computers
US5691713A (en) * 1994-01-18 1997-11-25 Fuji Xerox Co., Ltd. Communication apparatus allowing a receiver to recognize a generalized situation of a sender
JPH0779424A (en) * 1993-09-06 1995-03-20 Hitachi Ltd Multi-point video communication equipment
US7185054B1 (en) 1993-10-01 2007-02-27 Collaboration Properties, Inc. Participant display and selection in video conference calls
US6594688B2 (en) * 1993-10-01 2003-07-15 Collaboration Properties, Inc. Dedicated echo canceler for a workstation
US5689641A (en) 1993-10-01 1997-11-18 Vicor, Inc. Multimedia collaboration system arrangement for routing compressed AV signal through a participant site without decompressing the AV signal
US6898620B1 (en) 1996-06-07 2005-05-24 Collaboration Properties, Inc. Multiplexing video and control signals onto UTP
US6279029B1 (en) 1993-10-12 2001-08-21 Intel Corporation Server/client architecture and method for multicasting on a computer network
US5557724A (en) * 1993-10-12 1996-09-17 Intel Corporation User interface, method, and apparatus selecting and playing channels having video, audio, and/or text streams
US5809237A (en) * 1993-11-24 1998-09-15 Intel Corporation Registration of computer-based conferencing system
US5506954A (en) * 1993-11-24 1996-04-09 Intel Corporation PC-based conferencing system
US5673393A (en) * 1993-11-24 1997-09-30 Intel Corporation Managing bandwidth over a computer network having a management computer that allocates bandwidth to client computers upon request
US5600797A (en) * 1993-11-24 1997-02-04 Intel Corporation System for identifying new client and allocating bandwidth thereto by monitoring transmission of message received periodically from client computers informing of their current status
US5592547A (en) * 1993-11-24 1997-01-07 Intel Corporation Processing audio signals using a discrete state machine
US5574934A (en) * 1993-11-24 1996-11-12 Intel Corporation Preemptive priority-based transmission of signals using virtual channels
US5862388A (en) * 1993-11-24 1999-01-19 Intel Corporation Interrupt-time processing of received signals
US5524110A (en) * 1993-11-24 1996-06-04 Intel Corporation Conferencing over multiple transports
US5490247A (en) * 1993-11-24 1996-02-06 Intel Corporation Video subsystem for computer-based conferencing system
US5579389A (en) * 1993-11-24 1996-11-26 Intel Corporation Histogram-based processing of audio signals
US5631967A (en) * 1993-11-24 1997-05-20 Intel Corporation Processing audio signals using a state variable
US5754765A (en) * 1993-11-24 1998-05-19 Intel Corporation Automatic transport detection by attempting to establish communication session using list of possible transports and corresponding media dependent modules
US5566238A (en) * 1993-11-24 1996-10-15 Intel Corporation Distributed processing of audio signals
US5949891A (en) * 1993-11-24 1999-09-07 Intel Corporation Filtering audio signals from a combined microphone/speaker earpiece
US5602992A (en) * 1993-11-29 1997-02-11 Intel Corporation System for synchronizing data stream transferred from server to client by initializing clock when first packet is received and comparing packet time information with clock
US5555377A (en) * 1993-12-20 1996-09-10 International Business Machines Corporation System for selectively compressing data transferred in network in response to produced first output when network utilization exceeds first threshold and data length over limit
US5557538A (en) * 1994-05-18 1996-09-17 Zoran Microelectronics Ltd. MPEG decoder
US5572677A (en) * 1994-08-04 1996-11-05 Canon Information Systems, Inc. Method and apparatus for conversing over a network
US5802281A (en) * 1994-09-07 1998-09-01 Rsi Systems, Inc. Peripheral audio/video communication system that interfaces with a host computer and determines format of coded audio/video signals
US5821986A (en) * 1994-11-03 1998-10-13 Picturetel Corporation Method and apparatus for visual communications in a scalable network environment
JP3172643B2 (en) * 1994-11-14 2001-06-04 シャープ株式会社 Digital recording and reproducing apparatus
US5893037A (en) * 1994-12-09 1999-04-06 Eastman Kodak Company Combined electronic/silver-halide image capture system with cellular transmission capability
US6396816B1 (en) * 1994-12-20 2002-05-28 Intel Corporation Method and apparatus for multiple applications on a single ISDN line
US5600646A (en) * 1995-01-27 1997-02-04 Videoserver, Inc. Video teleconferencing system with digital transcoding
EP0724362B1 (en) * 1995-01-30 2000-03-22 International Business Machines Corporation Priority controlled transmission of multimedia streams via a telecommunication line
US5953350A (en) * 1995-03-13 1999-09-14 Selsius Systems, Inc. Multimedia client for multimedia/hybrid network
US6104754A (en) * 1995-03-15 2000-08-15 Kabushiki Kaisha Toshiba Moving picture coding and/or decoding systems, and variable-length coding and/or decoding system
JP3375245B2 (en) * 1995-04-26 2003-02-10 インターナショナル・ビジネス・マシーンズ・コーポレーション Apparatus for fault-tolerant multi-media program distribution
US5758094A (en) * 1995-05-24 1998-05-26 Winnov Computer video communications system
US6078349A (en) * 1995-06-07 2000-06-20 Compaq Computer Corporation Process and system for increasing the display resolution of a point-to-point video transmission relative to the actual amount of video data sent
US5689800A (en) * 1995-06-23 1997-11-18 Intel Corporation Video feedback for reducing data rate or increasing quality in a video processing system
US5752082A (en) * 1995-06-29 1998-05-12 Data Race System for multiplexing pins of a PC card socket and PC card bus adapter for providing audio communication between PC card and computer sound system
US5799036A (en) * 1995-06-29 1998-08-25 Staples; Leven E. Computer system which provides analog audio communication between a PC card and the computer's sound system
US9832244B2 (en) * 1995-07-14 2017-11-28 Arris Enterprises Llc Dynamic quality adjustment based on changing streaming constraints
WO1997016926A1 (en) * 1995-10-31 1997-05-09 Sarnoff Corporation Method and apparatus for determining ambient conditions from an image sequence
US5805153A (en) * 1995-11-28 1998-09-08 Sun Microsystems, Inc. Method and system for resizing the subtitles of a video
US5796724A (en) * 1995-12-28 1998-08-18 Intel Corporation Method and apparatus for partitioning transmission bandwidth among different data streams
US5784572A (en) * 1995-12-29 1998-07-21 Lsi Logic Corporation Method and apparatus for compressing video and voice signals according to different standards
US5796440A (en) * 1996-02-29 1998-08-18 Rupinski; Frederick A. Baseband video/audio/data transceiver
US6577714B1 (en) 1996-03-11 2003-06-10 At&T Corp. Map-based directory system
US6650998B1 (en) 1996-03-11 2003-11-18 At&T Corp. Information Search System for enabling a user of a user terminal to search a data source
US5784633A (en) * 1996-03-12 1998-07-21 International Business Machines Corporation System for obtaining status data unrelated to user data path from a modem and providing control data to the modem without interrupting user data flow
US5764235A (en) * 1996-03-25 1998-06-09 Insight Development Corporation Computer implemented method and system for transmitting graphical images from server to client at user selectable resolution
US5745700A (en) * 1996-05-13 1998-04-28 International Business Machines Corporation Multi media video matrix address decoder
US6678311B2 (en) * 1996-05-28 2004-01-13 Qualcomm Incorporated High data CDMA wireless communication system using variable sized channel codes
US5826031A (en) * 1996-06-10 1998-10-20 Sun Microsystems, Inc. Method and system for prioritized downloading of embedded web objects
US6324188B1 (en) * 1997-06-12 2001-11-27 Sharp Kabushiki Kaisha Voice and data multiplexing system and recording medium having a voice and data multiplexing program recorded thereon
US5784561A (en) * 1996-07-01 1998-07-21 At&T Corp. On-demand video conference method and apparatus
US6009151A (en) * 1996-08-27 1999-12-28 Data Race, Inc. PC card modem with microphone and speaker connectivity
US5768633A (en) * 1996-09-03 1998-06-16 Eastman Kodak Company Tradeshow photographic and data transmission system
US5877957A (en) 1996-11-06 1999-03-02 Ameritech Services, Inc. Method and system of programming at least one appliance to change state upon the occurrence of a trigger event
US5995490A (en) * 1996-12-13 1999-11-30 Siemens Information And Communication Networks, Inc. Method and system for integrating video and data transfers in a multimedia session
US6097435A (en) * 1997-01-31 2000-08-01 Hughes Electronics Corporation Video system with selectable bit rate reduction
US6084910A (en) * 1997-01-31 2000-07-04 Hughes Electronics Corporation Statistical multiplexer for video signals
US6005620A (en) * 1997-01-31 1999-12-21 Hughes Electronics Corporation Statistical multiplexer for live and pre-compressed video
US6188436B1 (en) 1997-01-31 2001-02-13 Hughes Electronics Corporation Video broadcast system with video data shifting
US6091455A (en) * 1997-01-31 2000-07-18 Hughes Electronics Corporation Statistical multiplexer for recording video
US6078958A (en) * 1997-01-31 2000-06-20 Hughes Electronics Corporation System for allocating available bandwidth of a concentrated media output
US6029127A (en) * 1997-03-28 2000-02-22 International Business Machines Corporation Method and apparatus for compressing audio signals
JPH10322673A (en) * 1997-05-15 1998-12-04 Canon Inc Communication equipment/method and storage medium
US6418203B1 (en) 1997-06-06 2002-07-09 Data Race, Inc. System and method for communicating audio information between a computer and a duplex speakerphone modem
US5838664A (en) 1997-07-17 1998-11-17 Videoserver, Inc. Video teleconferencing system with digital transcoding
US6348946B1 (en) 1997-08-14 2002-02-19 Lockheed Martin Corporation Video conferencing with video accumulator array VAM memory
US5999966A (en) * 1997-10-07 1999-12-07 Mcdougall; Floyd Control network-directed video conferencing switching system and method
US6816904B1 (en) * 1997-11-04 2004-11-09 Collaboration Properties, Inc. Networked video multimedia storage server environment
US5983263A (en) * 1998-01-02 1999-11-09 Intel Corporation Method and apparatus for transmitting images during a multimedia teleconference
US6208640B1 (en) 1998-02-27 2001-03-27 David Spell Predictive bandwidth allocation method and apparatus
US6373855B1 (en) * 1998-03-05 2002-04-16 Intel Corporation System and method for using audio performance to control video bandwidth
US6154769A (en) * 1998-03-27 2000-11-28 Hewlett-Packard Company Scheduling server requests to decrease response time and increase server throughput
US6697632B1 (en) * 1998-05-07 2004-02-24 Sharp Laboratories Of America, Inc. Multi-media coordinated delivery system and method
US6393056B1 (en) * 1998-07-01 2002-05-21 Texas Instruments Incorporated Compression of information from one detector as a function of information from another detector
US6317776B1 (en) 1998-12-17 2001-11-13 International Business Machines Corporation Method and apparatus for automatic chat room source selection based on filtered audio input amplitude of associated data streams
US6269483B1 (en) 1998-12-17 2001-07-31 International Business Machines Corp. Method and apparatus for using audio level to make a multimedia conference dormant
US6611872B1 (en) 1999-01-11 2003-08-26 Fastforward Networks, Inc. Performing multicast communication in computer networks by using overlay routing
US20010028662A1 (en) * 2000-01-18 2001-10-11 Hunt Paul M. Method and system of real-time optimization and implementation of content and advertising programming decisions for broadcasts and narrowcasts
US20020054205A1 (en) * 2000-02-22 2002-05-09 Magnuski Henry S. Videoconferencing terminal
US7280492B2 (en) * 2000-02-22 2007-10-09 Ncast Corporation Videoconferencing system
US7284064B1 (en) * 2000-03-21 2007-10-16 Intel Corporation Method and apparatus to determine broadcast content and scheduling in a broadcast system
US7275254B1 (en) 2000-11-21 2007-09-25 Intel Corporation Method and apparatus for determining and displaying the service level of a digital television broadcast signal
US6714761B1 (en) 2000-11-21 2004-03-30 Starcom Wireless, Inc. Meteor burst communication system having the capability of simultaneous communication with multiple remote units
US20020144265A1 (en) * 2001-03-29 2002-10-03 Connelly Jay H. System and method for merging streaming and stored content information in an electronic program guide
US20020144269A1 (en) * 2001-03-30 2002-10-03 Connelly Jay H. Apparatus and method for a dynamic electronic program guide enabling billing broadcast services per EPG line item
US20020143591A1 (en) * 2001-03-30 2002-10-03 Connelly Jay H. Method and apparatus for a hybrid content on demand broadcast system
US7185352B2 (en) * 2001-05-11 2007-02-27 Intel Corporation Method and apparatus for combining broadcast schedules and content on a digital broadcast-enabled client platform
US20030005465A1 (en) * 2001-06-15 2003-01-02 Connelly Jay H. Method and apparatus to send feedback from clients to a server in a content distribution broadcast system
US20030005451A1 (en) * 2001-06-15 2003-01-02 Connelly Jay H. Method and apparatus to distribute content descriptors in a content distribution broadcast system
US20020194603A1 (en) * 2001-06-15 2002-12-19 Jay H. Connelly Method and apparatus to distribute content using a multi-stage broadcast system
US8296441B2 (en) 2005-01-14 2012-10-23 Citrix Systems, Inc. Methods and systems for joining a real-time session of presentation layer protocol data
US8935316B2 (en) 2005-01-14 2015-01-13 Citrix Systems, Inc. Methods and systems for in-session playback on a local machine of remotely-stored and real time presentation layer protocol data
US20030005385A1 (en) * 2001-06-27 2003-01-02 Stieger Ronald D. Optical communication system with variable error correction coding
US7328455B2 (en) * 2001-06-28 2008-02-05 Intel Corporation Apparatus and method for enabling secure content decryption within a set-top box
US20030046633A1 (en) * 2001-08-28 2003-03-06 Jutzi Curtis E. Data error correction based on reported factors and predicted data interference factors
US7047456B2 (en) * 2001-08-28 2006-05-16 Intel Corporation Error correction for regional and dynamic factors in communications
US20030061611A1 (en) * 2001-09-26 2003-03-27 Ramesh Pendakur Notifying users of available content and content reception based on user profiles
US8943540B2 (en) * 2001-09-28 2015-01-27 Intel Corporation Method and apparatus to provide a personalized channel
US20030066090A1 (en) * 2001-09-28 2003-04-03 Brendan Traw Method and apparatus to provide a personalized channel
US20030135605A1 (en) * 2002-01-11 2003-07-17 Ramesh Pendakur User rating feedback loop to modify virtual channel content and/or schedules
US20030135857A1 (en) * 2002-01-11 2003-07-17 Ramesh Pendakur Content discovery in a digital broadcast data service
ES2197794B1 (en) * 2002-01-18 2005-03-16 Diseño De Sistemas En Silicio, S.A Data transmission method for a multiuser system of digital transmission of data point to multipoint.
JP4458041B2 (en) * 2002-07-30 2010-04-28 ソニー株式会社 Program, an information processing method and apparatus, and data structure
US8238241B2 (en) * 2003-07-29 2012-08-07 Citrix Systems, Inc. Automatic detection and window virtualization for flow control
US7542471B2 (en) * 2002-10-30 2009-06-02 Citrix Systems, Inc. Method of determining path maximum transmission unit
US7656799B2 (en) * 2003-07-29 2010-02-02 Citrix Systems, Inc. Flow control system architecture
US8233392B2 (en) 2003-07-29 2012-07-31 Citrix Systems, Inc. Transaction boundary detection for reduction in timeout penalties
US8432800B2 (en) * 2003-07-29 2013-04-30 Citrix Systems, Inc. Systems and methods for stochastic-based quality of service
US7630305B2 (en) * 2003-07-29 2009-12-08 Orbital Data Corporation TCP selective acknowledgements for communicating delivered and missed data packets
US8270423B2 (en) * 2003-07-29 2012-09-18 Citrix Systems, Inc. Systems and methods of using packet boundaries for reduction in timeout prevention
US8437284B2 (en) 2003-07-29 2013-05-07 Citrix Systems, Inc. Systems and methods for additional retransmissions of dropped packets
US7616638B2 (en) 2003-07-29 2009-11-10 Orbital Data Corporation Wavefront detection and disambiguation of acknowledgments
US6947409B2 (en) * 2003-03-17 2005-09-20 Sony Corporation Bandwidth management of virtual networks on a shared network
US8200828B2 (en) * 2005-01-14 2012-06-12 Citrix Systems, Inc. Systems and methods for single stack shadowing
US7565614B2 (en) * 2003-05-02 2009-07-21 Sony Corporation Image data processing apparatus and image data processing method for a video conference system
US7092693B2 (en) 2003-08-29 2006-08-15 Sony Corporation Ultra-wide band wireless / power-line communication system for delivering audio/video content
US7599002B2 (en) * 2003-12-02 2009-10-06 Logitech Europe S.A. Network camera mounting system
US20050120128A1 (en) * 2003-12-02 2005-06-02 Wilife, Inc. Method and system of bandwidth management for streaming data
US7680885B2 (en) * 2004-04-15 2010-03-16 Citrix Systems, Inc. Methods and apparatus for synchronization of data set representations in a bandwidth-adaptive manner
US20060002315A1 (en) * 2004-04-15 2006-01-05 Citrix Systems, Inc. Selectively sharing screen data
US20060031779A1 (en) * 2004-04-15 2006-02-09 Citrix Systems, Inc. Selectively sharing screen data
US7827139B2 (en) * 2004-04-15 2010-11-02 Citrix Systems, Inc. Methods and apparatus for sharing graphical screen data in a bandwidth-adaptive manner
JP2006033356A (en) * 2004-07-15 2006-02-02 Renesas Technology Corp Audio data processing apparatus
US8149739B2 (en) * 2004-10-15 2012-04-03 Lifesize Communications, Inc. Background call validation
US20060106929A1 (en) * 2004-10-15 2006-05-18 Kenoyer Michael L Network conference communications
US7545435B2 (en) * 2004-10-15 2009-06-09 Lifesize Communications, Inc. Automatic backlight compensation and exposure control
US20060171453A1 (en) * 2005-01-04 2006-08-03 Rohlfing Thomas R Video surveillance system
US20060159432A1 (en) * 2005-01-14 2006-07-20 Citrix Systems, Inc. System and methods for automatic time-warped playback in rendering a recorded computer session
US8230096B2 (en) * 2005-01-14 2012-07-24 Citrix Systems, Inc. Methods and systems for generating playback instructions for playback of a recorded computer session
US8340130B2 (en) * 2005-01-14 2012-12-25 Citrix Systems, Inc. Methods and systems for generating playback instructions for rendering of a recorded computer session
US8145777B2 (en) 2005-01-14 2012-03-27 Citrix Systems, Inc. Method and system for real-time seeking during playback of remote presentation protocols
US7831728B2 (en) 2005-01-14 2010-11-09 Citrix Systems, Inc. Methods and systems for real-time seeking during real-time playback of a presentation layer protocol data stream
US20060255931A1 (en) * 2005-05-12 2006-11-16 Hartsfield Andrew J Modular design for a security system
US8443040B2 (en) * 2005-05-26 2013-05-14 Citrix Systems Inc. Method and system for synchronizing presentation of a dynamic data set to a plurality of nodes
US8191008B2 (en) 2005-10-03 2012-05-29 Citrix Systems, Inc. Simulating multi-monitor functionality in a single monitor environment
US20070115388A1 (en) * 2005-10-12 2007-05-24 First Data Corporation Management of video transmission over networks
US20070081522A1 (en) * 2005-10-12 2007-04-12 First Data Corporation Video conferencing systems and methods
US7378540B2 (en) * 2005-10-21 2008-05-27 Catalytic Distillation Technologies Process for producing organic carbonates
US7958355B2 (en) * 2006-03-01 2011-06-07 Microsoft Corporation Keytote component
US20070250878A1 (en) * 2006-04-05 2007-10-25 Ryckman Lawrence G Interactive system for conducting contest
US7996495B2 (en) 2006-04-06 2011-08-09 General Electric Company Adaptive selection of image streaming mode
JP2007329752A (en) * 2006-06-08 2007-12-20 Matsushita Electric Ind Co Ltd Image processor and image processing method
US8078972B2 (en) * 2006-09-15 2011-12-13 Citrix Systems, Inc. Methods and interfaces for displaying performance data related to a current remote access session
US7978617B2 (en) * 2006-09-15 2011-07-12 Citrix Systems, Inc. Methods for providing performance improvement recommendations
US8488839B2 (en) * 2006-11-20 2013-07-16 Videosurf, Inc. Computer program and apparatus for motion-based object extraction and tracking in video
US8379915B2 (en) * 2006-11-20 2013-02-19 Videosurf, Inc. Method of performing motion-based object extraction and tracking in video
US8059915B2 (en) * 2006-11-20 2011-11-15 Videosurf, Inc. Apparatus for and method of robust motion estimation using line averages
US7706266B2 (en) * 2007-03-12 2010-04-27 Citrix Systems, Inc. Systems and methods of providing proxy-based quality of service
US20080276029A1 (en) * 2007-05-03 2008-11-06 Haraden Ryan S Method and System for Fast Flow Control
US7903899B2 (en) * 2007-05-23 2011-03-08 Videosurf, Inc. Method of geometric coarsening and segmenting of still images
US7920748B2 (en) * 2007-05-23 2011-04-05 Videosurf, Inc. Apparatus and software for geometric coarsening and segmenting of still images
CN101355683B (en) 2007-07-23 2011-07-13 中兴通讯股份有限公司 Method for automatically mapping image of wideband private wire conference television system
KR101425668B1 (en) * 2007-07-26 2014-08-04 페어차일드코리아반도체 주식회사 Frequency modulation device and switch mode power supply using the same
US9661267B2 (en) * 2007-09-20 2017-05-23 Lifesize, Inc. Videoconferencing system discovery
WO2010006334A1 (en) 2008-07-11 2010-01-14 Videosurf, Inc. Apparatus and software system for and method of performing a visual-relevance-rank subsequent search
US8364660B2 (en) * 2008-07-11 2013-01-29 Videosurf, Inc. Apparatus and software system for and method of performing a visual-relevance-rank subsequent search
JP5309765B2 (en) * 2008-07-29 2013-10-09 富士通株式会社 Information Access System, the information storage device, and a read-write device
WO2010042580A1 (en) * 2008-10-08 2010-04-15 Citrix Systems, Inc. Systems and methods for allocating bandwidth by an intermediary for flow control
US8305421B2 (en) * 2009-06-29 2012-11-06 Lifesize Communications, Inc. Automatic determination of a configuration for a conference
US20110082691A1 (en) * 2009-10-05 2011-04-07 Electronics And Telecommunications Research Institute Broadcasting system interworking with electronic devices
US8130790B2 (en) * 2010-02-08 2012-03-06 Apple Inc. Digital communications system with variable-bandwidth traffic channels
US9508011B2 (en) 2010-05-10 2016-11-29 Videosurf, Inc. Video visual and audio query
KR101683291B1 (en) * 2010-05-14 2016-12-06 엘지전자 주식회사 Display apparatus and control method thereof
US9079494B2 (en) 2010-07-01 2015-07-14 Mill Mountain Capital, LLC Systems, devices and methods for vehicles
US8533166B1 (en) * 2010-08-20 2013-09-10 Brevity Ventures LLC Methods and systems for encoding/decoding files and transmission thereof
US9191686B2 (en) 2011-07-22 2015-11-17 Honeywell International Inc. System and method of implementing synchronized audio and video streaming
US8615159B2 (en) 2011-09-20 2013-12-24 Citrix Systems, Inc. Methods and systems for cataloging text in a recorded session
US9118940B2 (en) 2012-07-30 2015-08-25 Google Technology Holdings LLC Video bandwidth allocation in a video conference
US9406305B2 (en) * 2012-12-21 2016-08-02 Digimarc Corporation Messaging by writing an image into a spectrogram
US9035992B1 (en) * 2013-04-08 2015-05-19 Google Inc. Bandwidth modulation system and method
US9027147B2 (en) * 2013-05-13 2015-05-05 Hewlett-Packard Development Company, L.P. Verification of serialization codes
CN104219479B (en) * 2013-05-30 2017-11-03 中国电信股份有限公司 Video communication service processing method and system

Citations (4)

Publication number Priority date Publication date Assignee Title
EP0119588A2 (en) * 1983-03-16 1984-09-26 International Standard Electric Corporation Integrated communication system and method of establishing videotelephone links in the same
US4494144A (en) * 1982-06-28 1985-01-15 At&T Bell Laboratories Reduced bandwidth video transmission
US4541008A (en) * 1982-12-27 1985-09-10 Jones Futura Foundation, Ltd. Television signal bandwidth reduction using variable rate transmission
US4682225A (en) * 1985-09-13 1987-07-21 The United States Of America As Represented By The Administrator Of The National Aeronautics And Space Administration Method and apparatus for telemetry adaptive bandwidth compression

Family Cites Families (33)

Publication number Priority date Publication date Assignee Title
DE119588C (en) *
US3873771A (en) * 1972-04-11 1975-03-25 Telescan Communications System Simultaneous transmission of a video and an audio signal through an ordinary telephone transmission line
US3795763A (en) * 1972-04-18 1974-03-05 Communications Satellite Corp Digital television transmission system
US4394774A (en) * 1978-12-15 1983-07-19 Compression Labs, Inc. Digital video compression system and methods utilizing scene adaptive coding with rate buffer feedback
US4425642A (en) * 1982-01-08 1984-01-10 Applied Spectrum Technologies, Inc. Simultaneous transmission of two information signals within a band-limited communications channel
JPH0254714B2 (en) * 1982-12-28 1990-11-22 Pioneer Electronic Corp
US4720745A (en) * 1983-06-22 1988-01-19 Digivision, Inc. Method and apparatus for enhancing video displays
US4654484A (en) * 1983-07-21 1987-03-31 Interand Corporation Video compression/expansion system
US4736407A (en) * 1986-04-08 1988-04-05 The United States Of America As Represented By The Secretary Of The Army Computer assisted graphic teleconferencing method and apparatus
US4739413A (en) * 1985-06-14 1988-04-19 Luma Telecom, Inc. Video-optimized modulator-demodulator with adjacent modulating amplitudes matched to adjacent pixel gray values
US4792993A (en) * 1985-10-30 1988-12-20 Capetronic (Bsr) Ltd. TVRD receiver system with automatic bandwidth adjustment
WO1987004033A1 (en) * 1985-12-24 1987-07-02 British Broadcasting Corporation Method of coding a video signal for transmission in a restricted bandwidth
US4858026A (en) * 1986-04-14 1989-08-15 U.S. Philips Corporation Image display
US4797750A (en) * 1986-04-16 1989-01-10 John Hopkins University Method and apparatus for transmitting/recording computer-generated displays on an information channel having only audio bandwidth
DE3786946T2 (en) * 1986-04-30 1994-01-27 Sharp Kk Method and system for multiplex transmission of an audio signal and a video signal via a communication cable.
JPS63232589A (en) * 1987-03-19 1988-09-28 Fujitsu Ltd System for making band in sound-picture transmitting frame variable
US4733299A (en) * 1987-03-26 1988-03-22 New York Institute Of Technology Method and apparatus for generating progressively scanned television information
US4780761A (en) * 1987-06-02 1988-10-25 Eastman Kodak Company Digital image compression and transmission system visually weighted transform coefficients
US4825434A (en) * 1987-09-10 1989-04-25 Gte Laboratories Incorporated Variable bandwidth control system
US5136575A (en) * 1987-12-16 1992-08-04 Kabushiki Kaisha Myukomu Cancelling circuit and transmission system
DE3850952T2 (en) * 1987-12-22 1995-02-23 Philips Nv Video signal coding and decoding with an adaptive filter.
US4849811A (en) * 1988-07-06 1989-07-18 Ben Kleinerman Simultaneous audio and video transmission with restricted bandwidth
DE3823219C1 (en) * 1988-07-08 1989-05-18 Telenorma Telefonbau Und Normalzeit Gmbh, 6000 Frankfurt, De
JP3002471B2 (en) * 1988-08-19 2000-01-24 株式会社日立製作所 Program distribution device
US5157491A (en) * 1988-10-17 1992-10-20 Kassatly L Samuel A Method and apparatus for video broadcasting and teleconferencing
JPH02246431A (en) * 1989-03-18 1990-10-02 Fujitsu Ltd Video audio multiplex system
US4999831A (en) * 1989-10-19 1991-03-12 United Telecommunications, Inc. Synchronous quantized subcarrier multiplexer for digital transport of video, voice and data
US4949169A (en) * 1989-10-27 1990-08-14 International Business Machines Corporation Audio-video data interface for a high speed communication link in a video-graphics display window environment
US5164980A (en) * 1990-02-21 1992-11-17 Alkanox Corporation Video telephone system
US5072442A (en) * 1990-02-28 1991-12-10 Harris Corporation Multiple clock rate teleconferencing network
JPH04104683A (en) * 1990-08-24 1992-04-07 Nec Eng Ltd High-efficiency image compressing and encoding device
JPH04139988A (en) * 1990-09-29 1992-05-13 Fuji Xerox Co Ltd Electronic conference system
JPH04139932A (en) * 1990-09-29 1992-05-13 Fuji Xerox Co Ltd Information display method

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4494144A (en) * 1982-06-28 1985-01-15 At&T Bell Laboratories Reduced bandwidth video transmission
US4541008A (en) * 1982-12-27 1985-09-10 Jones Futura Foundation, Ltd. Television signal bandwidth reduction using variable rate transmission
EP0119588A2 (en) * 1983-03-16 1984-09-26 International Standard Electric Corporation Integrated communication system and method of establishing videotelephone links in the same
US4682225A (en) * 1985-09-13 1987-07-21 The United States Of America As Represented By The Administrator Of The National Aeronautics And Space Administration Method and apparatus for telemetry adaptive bandwidth compression

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
TELECOMMUNICATIONS vol. 25, no. 6, June 1991, USA pages 37 - 46 M. GRIMSHAW 'LAN INTERCONNECTION TECHNOLOGY' *
WEISS C.: "DESK TOP VIDEO CONFERENCING - AN IMPORTANT FEATURE OF FUTURE VISUAL COMMUNICATIONS", INTERNATIONAL CONFERENCE ON COMMUNICATIONS, INCLUDING SUPERCOMM TECHNICAL SESSIONS, ATLANTA, APR. 15 - 19, 1990, NEW YORK, IEEE, US, vol. 1 of 4, 16 April 1990 (1990-04-16), US, pages 134 - 139, XP000147391 *

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO1998056177A1 (en) * 1997-06-02 1998-12-10 Northern Telecom Limited Dynamic selection of media streams for display
US6128649A (en) * 1997-06-02 2000-10-03 Nortel Networks Limited Dynamic selection of media streams for display
FR2794326A1 (en) * 1999-05-31 2000-12-01 Canon Europa Nv Method of updating processing peripherals on communication network

Also Published As

Publication number Publication date Type
US5392223A (en) 1995-02-21 grant
DE69323357T2 (en) 1999-09-23 grant
EP0581101B1 (en) 1999-02-03 grant
CA2096160C (en) 1997-01-21 grant
CA2096160A1 (en) 1994-01-30 application
JP3061981B2 (en) 2000-07-10 grant
DE69323357D1 (en) 1999-03-18 grant
JPH06225266A (en) 1994-08-12 application

Similar Documents

Publication Publication Date Title
Cherry et al. An experimental study of the possible bandwidth compression of visual image signals
US7006155B1 (en) Real time programmable chroma keying with shadow generation
US5673253A (en) Dynamic allocation of telecommunications resources
Mitchell et al. MPEG video compression standard
US5864415A (en) Fiber optic network with wavelength-division-multiplexed transmission to customer premises
US5751338A (en) Methods and systems for multimedia communications via public telephone networks
US5710829A (en) System and method for focused-based image segmentation for video signals
US5559900A (en) Compression of signals for perceptual quality by selecting frequency bands having relatively high energy
US5729535A (en) Method and apparatus for adapting a computer for wireless communications
US6020915A (en) Method and system for providing an analog voice-only endpoint with pseudo multimedia service
US5136581A (en) Arrangement for reserving and allocating a plurality of competing demands for an ordered bus communication network
US5819043A (en) Multimedia resource reservation system
US5638363A (en) Switched telecommunications network with bandwidth management for different types of multiplexed traffic
US6757005B1 (en) Method and system for multimedia video processing
US5323187A (en) Image compression system by setting fixed bit rates
US6205154B1 (en) Automatic path selection for fiber-optic transmission networks
US6304606B1 (en) Image data coding and restoring method and apparatus for coding and restoring the same
US4748618A (en) Telecommunications interface
US6721282B2 (en) Telecommunication data compression apparatus and method
US5701465A (en) Method and apparatus for reserving system resources to assure quality of service
US5841763A (en) Audio-video conferencing system
US6335755B1 (en) Methods and apparatus for the creation and transmission of 3-dimensional images
US5227875A (en) System for transmitting encoded image data with quick image expansion and contraction
US6188428B1 (en) Transcoding video file server and methods for its use
US5526350A (en) Communication network with bandwidth managers for allocating bandwidth to different types of traffic

Legal Events

Date Code Title Description
AK Designated contracting states:

Kind code of ref document: A1

Designated state(s): DE FR GB

17P Request for examination filed

Effective date: 19940519

17Q First examination report

Effective date: 19960510

AK Designated contracting states:

Kind code of ref document: B1

Designated state(s): DE FR GB

REF Corresponds to:

Ref document number: 69323357

Country of ref document: DE

Date of ref document: 19990318

ET Fr: translation filed
26N No opposition filed
REG Reference to a national code

Ref country code: GB

Ref legal event code: IF02

PGFP Postgrant: annual fees paid to national office

Ref country code: FR

Payment date: 20040721

Year of fee payment: 12

PG25 Lapsed in a contracting state announced via postgrant inform. from nat. office to epo

Ref country code: FR

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20060331

REG Reference to a national code

Ref country code: FR

Ref legal event code: ST

Effective date: 20060331

REG Reference to a national code

Ref country code: GB

Ref legal event code: 732E

Free format text: REGISTERED BETWEEN 20111103 AND 20111109

REG Reference to a national code

Ref country code: DE

Ref legal event code: R082

Ref document number: 69323357

Country of ref document: DE

Representative=s name: BOSCH JEHLE PATENTANWALTSGESELLSCHAFT MBH, DE

REG Reference to a national code

Ref country code: DE

Ref legal event code: R082

Ref document number: 69323357

Country of ref document: DE

Representative=s name: BOSCH JEHLE PATENTANWALTSGESELLSCHAFT MBH, DE

Effective date: 20120113

Ref country code: DE

Ref legal event code: R081

Ref document number: 69323357

Country of ref document: DE

Owner name: CISCO TECHNOLOGY, INC., US

Free format text: FORMER OWNER: INTERNATIONAL BUSINESS MACHINES CORP., ARMONK, US

Effective date: 20120113

PGFP Postgrant: annual fees paid to national office

Ref country code: GB

Payment date: 20120725

Year of fee payment: 20

PGFP Postgrant: annual fees paid to national office

Ref country code: DE

Payment date: 20120727

Year of fee payment: 20

REG Reference to a national code

Ref country code: DE

Ref legal event code: R071

Ref document number: 69323357

Country of ref document: DE

REG Reference to a national code

Ref country code: GB

Ref legal event code: PE20

Expiry date: 20130711

PG25 Lapsed in a contracting state announced via postgrant inform. from nat. office to epo

Ref country code: DE

Free format text: LAPSE BECAUSE OF EXPIRATION OF PROTECTION

Effective date: 20130713

PG25 Lapsed in a contracting state announced via postgrant inform. from nat. office to epo

Ref country code: GB

Free format text: LAPSE BECAUSE OF EXPIRATION OF PROTECTION

Effective date: 20130711