WO1998049835A1 - Video and audio interface for a computer - Google Patents

Video and audio interface for a computer

Info

Publication number
WO1998049835A1
Authority
WO
WIPO (PCT)
Prior art keywords
remote device
video data
interface
host device
Prior art date
Application number
PCT/US1998/008916
Other languages
English (en)
Inventor
Junien Labrousse
Olivier G. Garbe
Patrick J. McCarthy
Harry L. Graham
Original Assignee
Winnov, L.P.
Priority date
Filing date
Publication date
Application filed by Winnov, L.P.
Publication of WO1998049835A1

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N7/00 Television systems
    • H04N7/14 Systems for two-way working
    • H04N7/141 Systems for two-way working between two video terminals, e.g. videophone
    • H04N7/148 Interfacing a video terminal to a particular transmission medium, e.g. ISDN
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60 Control of cameras or camera modules
    • H04N23/66 Remote control of cameras or camera parts, e.g. by remote control devices
    • H04N23/661 Transmitting camera control signals through networks, e.g. control via the Internet
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N19/00 Methods or arrangements for coding, decoding, compressing or decompressing digital video signals
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N7/00 Television systems
    • H04N7/14 Systems for two-way working
    • H04N7/141 Systems for two-way working between two video terminals, e.g. videophone
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N7/00 Television systems
    • H04N7/14 Systems for two-way working
    • H04N7/15 Conference systems

Definitions

  • This invention relates generally to the field of computer interfaces. More specifically, the invention relates to video and audio interfaces for a personal computer.
  • Multimedia application programs are becoming increasingly popular as their potential uses become known to the general public. This is especially true of video conferencing applications for business. The potential savings over actually meeting face to face are enormous. Two keys to expanding this market are ease of use and low cost.
  • An ordinary audio-video system used for video conferencing contains three basic elements:
  • a remote device: this contains, among other things, the camera (usually a charge-coupled device (CCD)).
  • a host device: this is the computer which contains, among other things, a central processing unit (CPU) that runs the video conferencing software and provides the video display, the audio playback, and the control functions.
  • an interface: this contains, among other things, an interface cable of some finite length that contains a number of wires for conveying signals between the remote device and the host device.
  • Ordinary video conferencing equipment suffers from a number of drawbacks.
  • ordinary systems provide no control of the flow of data between the remote device and the host device. Data can only flow from the remote device to the host device. Video and audio data are transmitted on separate wires.
  • the remote device is constantly capturing video or audio data and transmitting it to the host device in real time. This presents the host device with the responsibility of storing all of this data or risk losing some of it. In many applications, some of this data is not needed but is transmitted anyway. Also, the large quantity of data being transmitted requires a large number of wires on which to carry it which increases the chance of generating interference problems between the wires.
  • the interface cable between the remote device and the host device is cumbersome because of the large number of wires which it contains.
  • the interface cable consists of a sixteen bit parallel bus in addition to other control lines. This can add up to an interface cable that contains as many as twenty six wires. This makes the interface cable rigid and heavy.
  • the host device contains a dedicated video card which contains a large number of circuit elements.
  • this dedicated video card takes up a valuable expansion slot in the host device which could be put to other uses.
  • because the dedicated video card is not usually installed by the computer manufacturer, someone else has to install it. This can present a significant obstacle to some computer owners who do not feel confident in their ability to successfully install the dedicated video card themselves.
  • Yet a further object and advantage of the present invention is to provide a host device that is simple for the average user to manage.
  • An advanced video and audio interface for a computer comprises an interface device and an interface cable.
  • a system employing the interface comprises a host device, a remote device, and the interface cable disposed between the host device and the remote device so as to communicate data from the remote device to the host device and instructions from the host device to the remote device.
  • the remote device is capable of transmitting data over the interface to the host device and comprises the interface device, a video data scaling device, a video data compression device, an audio data analog to digital converter, and an audio data compression device.
  • the host device is capable of transmitting data over the interface to the remote device and of selectively configuring each of the video data scaling device and the video data compression device over the interface.
  • FIG. 1 is a block diagram of an audio-video system according to a presently preferred embodiment of the present invention.
  • FIG. 2 is a block diagram of two different types of video remote devices according to a presently preferred embodiment of the present invention.
  • FIG. 3 is a graphical representation of a frame of video data.
  • FIG. 4 is a block diagram of an audio remote device according to a presently preferred embodiment of the present invention.
  • FIG. 5 is a block diagram of the interface cable between the host device and the remote device detailing the signals that are carried over the interface cable according to a presently preferred embodiment of the present invention.
  • FIG. 6 is a graphical representation of the read word and read stream protocols according to a presently preferred embodiment of the present invention.
  • FIG. 7 is a graphical representation of the write word and write stream protocols according to a presently preferred embodiment of the present invention.
  • In FIG. 1, a block diagram of an audio-video system 10 is shown.
  • the system 10 consists of a host device 12, a remote device 14, and an interface cable 16.
  • the general function of the remote device 14 is to provide audio and video data to the host device 12.
  • the remote device 14 would be a video camera and microphone.
  • the remote device 14 could be a VCR, a laser disc player, or any other of a number of possible video data sources.
  • the host device 12 manipulates this audio and video data from the remote device 14 to provide the desired result.
  • the host device 12 conveys the audio and video data via a modem connection to a second party with whom the conference is being held.
  • the video data may be displayed locally for preview purposes.
  • the host device 12 might simply display the video data, record the audio and video data, or manipulate the audio and video data for any of a number of other possible applications.
  • FIG. 1 shows that the remote device 14 comprises a video remote device 18, an audio remote device 20, an accessory device 22, and an interface device 24.
  • the video remote device 18 is connected to the interface device 24 via line 25.
  • the audio remote device 20 is connected to the interface device 24 via line 26.
  • the accessory device 22 is connected to the interface device 24 via line 27.
  • the video remote device 18 will be further explained with respect to FIGs. 2 and 3.
  • the audio remote device 20 will be further explained with respect to FIG. 4.
  • while the remote device 14 shown comprises both the audio remote device 20 and the video remote device 18, those of ordinary skill in the art will realize that this will not necessarily be the case. One or the other device may be eliminated depending on the application.
  • the accessory device 22 represents any of a number of devices that might be located in the remote device to augment the video remote device 18 or the audio remote device 20.
  • the existence of an accessory device 22 at all is optional and depends on the context. For example, in a security camera context, the security camera might want to have the ability to rotate from side to side. The mechanism that would enable the security camera to rotate would be one form of accessory device 22.
  • Other contexts include lens control, robotic mobility, LCD displays for playback, or any of a number of other possible accessories.
  • In FIG. 2, block diagrams of two different types of video remote device 18 are shown in further detail.
  • the video portion of the remote device 14 is shown as being either an analog video remote device 18a or a digital video remote device 18b.
  • the analog video remote device 18a starts with an analog video source 30 which generates an analog video data signal.
  • the analog video data signal is transmitted over line 32 to a video data analog to digital (A/D) converter 34 which samples the analog video data signal and converts it to a digital signal.
  • the digital video remote device 18b starts with a digital video source 36 which generates a digital video data signal.
  • This device is well known to those of ordinary skill in the art and need not be described further herein.
  • the signals from the video data A/D converter 34 of the analog video remote device 18a and from the digital video source 36 of the digital video remote device 18b are now both digital and receive similar processing in either the analog video remote device 18a or the digital video remote device 18b from this point forward.
  • these signals will be referred to commonly as a raw digital video data signal and will be discussed together except as noted.
  • the raw digital video data signal is transmitted over line 38 to a digital preprocessor 40.
  • the digital preprocessor 40 then conditions the raw digital video data for further processing.
  • the digital preprocessor 40 provides gamma, contrast, brightness, hue, and saturation correction, each of which is well known to those of ordinary skill in the art and need not be described further herein.
  • the preprocessed digital video data signal resulting from the digital preprocessor 40 is transmitted over line 42 to a video data scaling device 44.
  • the video data scaling device 44 then scales the raw digital video data signal to the appropriate level for the application that the host device 12 of FIG. 1 is set to perform. For example, if the host device 12 is set to display the video image in a window on a display monitor, then the video data scaling device 44 will be configured to reduce the total volume of information from the raw digital video data signal because it will not all be needed to provide the image resolution required by the display monitor. In this way, a higher refresh rate may be achieved by discarding unneeded bandwidth.
  • the video data scaling device 44 achieves this result through the application of a scaling program to the preprocessed digital video data on a frame by frame basis.
  • the scaling program can be preloaded by the manufacturer employing variables for each application or, in a preferred embodiment, a micro-coded engine can be loaded from the host device 12 of FIG. 1. The latter allows the configuration of the scaling program to be optimized from application to application providing greater flexibility and speed.
  • those of ordinary skill in the art will recognize micro-coded engines as dedicated programs that are run by integrated circuits that are detached from the CPU. Examples of scaling of the preprocessed digital video data will be presented with respect to FIG. 3 below.
  • the video data scaling device 44 may also perform some buffering functions that are well known to those of ordinary skill in the art and need not be described further herein.
  • In FIG. 3, a graphical representation of a source frame 50 of video data is shown.
  • the source frame 50 is made up of a number of source lines 52 of data and each source line is made up of a number of source pixels 54 of data.
  • These source lines 52 and source pixels 54 represent the resolution of the video image contained in the source frame 50 of video data.
  • the video data will comprise a stream of source frames which will each have a format identical to the source frame 50 shown.
  • the number of source lines 52, source pixels 54, and source frames 50 will be determined by the analog video source 30 of FIG. 2 and the video data A/D converter 34 of FIG. 2 or the digital video source 36 of FIG. 2 depending on which type of video remote device, 18a or 18b respectively, is in use.
  • For most applications, the resolution of each source frame 50 will be greater than necessary. Hence, it would be inefficient to transmit all of the data from the remote device 14 of FIG. 1 to the host device 12 of FIG. 1 rather than just the data that is needed. There are a number of ways to reduce the inefficiency; however, each presents a risk of creating errors in the data. For description purposes, assume that the video data for each source frame 50 is transmitted beginning with the pixel in the upper left hand corner and proceeds from left to right and from top to bottom. Those of ordinary skill in the art will realize that any number of other reference points could be chosen instead of the upper left hand corner and would still be considered within the scope of the invention.
  • spatial scaling is accomplished in the scaling device 44 of FIG. 2.
  • a typical source frame 50 may have a resolution of 1024 source pixels 54 by 480 source lines 52.
  • a typical video monitor upon which such video data is to be displayed may only have a resolution of 640 pixels by 480 lines. The problem is that there are far more source pixels per line than can be displayed. The solution is for only a fraction of the source data to be displayed.
  • the vertical ratio is one (480/480) while the horizontal ratio is 0.625 (640/1024), or roughly two thirds. So, roughly one third of the source pixels need to be eliminated somehow. There are any number of ways to accomplish this. The easiest way is simply to ignore every third pixel of data; however, this can introduce very large errors. A more precise way is to average three source pixels into two display pixels.
  • the micro-coded engine would include the values for four variables that govern the scaling program for this particular example.
  • the four variables would be horizontal ratio, number of source pixels, vertical ratio, and number of source lines. In the latter example above, the horizontal ratio is 0.625, the number of source pixels is three, the vertical ratio is one, and the number of source lines is one.
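As a concrete illustration of the averaging approach described above, the following Python sketch blends each group of three source pixels into two display pixels. The exact blend (d0 from s0 and s1, d1 from s1 and s2) is an assumption; the patent does not specify the micro-code at this level of detail.

```python
def scale_line_3to2(src):
    """Horizontally scale one line of pixel values by averaging each
    group of three source pixels into two display pixels.

    Hypothetical sketch: the blend (d0 = avg(s0, s1), d1 = avg(s1, s2))
    is one plausible reading of "average three source pixels into two
    display pixels"; the patent's micro-coded engine is not specified.
    """
    out = []
    for i in range(0, len(src) - 2, 3):
        s0, s1, s2 = src[i], src[i + 1], src[i + 2]
        out.append((s0 + s1) / 2)  # first display pixel
        out.append((s1 + s2) / 2)  # second display pixel
    return out
```

Applied to a 1024-pixel source line this yields roughly the two-thirds ratio discussed above; the remaining trim to exactly 640 display pixels would be governed by the scaling-program variables.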
  • Another form of spatial scaling is to change the image size without changing the image density.
  • Such an example is graphically represented as display frame 60.
  • the shape of the display frame 60 could vary and is only shown as rectangular for ease of description.
  • the display frame 60 is made up of a number of display lines 62 of data and each display line is made up of a number of display pixels 64 of data.
  • the problem is that entire source lines of data are not needed, along with a portion of the source pixels from the source lines that do contain the desired display data.
  • the solution is to implement offsets to eliminate the undesired source data.
  • a vertical offset 66 of a desired number of source lines and a horizontal offset 68 of a desired number of source pixels are shown.
  • the pixels within these offsets need to be eliminated somehow. There are any number of ways to accomplish this. The easiest way is simply to ignore every source pixel of data within these offsets.
  • the host device 12 of FIG. 1 loads a micro-coded engine into the scaling device 44 of FIG. 2.
  • the micro-coded engine would include the values for four variables that govern the scaling program for this particular example.
  • the four variables would be horizontal offset, number of display pixels, vertical offset, and number of display lines.
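A minimal sketch of this offset-based scaling, assuming a frame is represented as a list of lines and each line as a list of pixel values (the function and parameter names are hypothetical):

```python
def crop_frame(frame, h_offset, n_pixels, v_offset, n_lines):
    """Apply the four scaling-program variables named above: skip
    v_offset source lines and h_offset source pixels, then keep an
    n_lines-by-n_pixels region of the source frame.
    Hypothetical sketch of the offset-based scaling described above."""
    return [line[h_offset:h_offset + n_pixels]
            for line in frame[v_offset:v_offset + n_lines]]
```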
  • the scaled digital video data signal resulting from the video data scaling device 44 is transmitted over line 46 to a video data compression device 48.
  • compression of the scaled digital video data signal may not be necessary and may be ignored. In such applications it may be cheaper to also eliminate the video data compression device 48 altogether.
  • the video data is made up of a series of source frames 50 of FIG. 3, and the number of source frames per unit time, i.e. the frame rate, may exceed what the application requires.
  • the operation that the video data compression device 48 may perform to reduce the frame rate is that of temporal transformation.
  • NTSC television standards set a maximum frame rate of 30 frames per second while a typical video source can supply up to 60 frames per second.
  • the problem is that there are more source frames of data than can be displayed.
  • the solution is for only a portion of the source frames to be displayed.
  • one half (30/60) of the source frames need to be eliminated somehow.
  • the host device 12 of FIG. 1 loads a micro-coded engine into the video data compression device 48.
  • the micro-coded engine would include the values for the variables that govern the temporal transformation for this particular example.
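The temporal transformation described above (a 60 frame-per-second source reduced to 30) can be sketched as a frame-dropping loop. The accumulator scheme below is an assumption; the text only states that half of the source frames are eliminated.

```python
def temporal_scale(frames, src_rate=60, dst_rate=30):
    """Reduce the frame rate by keeping a dst_rate/src_rate fraction of
    the source frames. For 60 -> 30 this keeps every other frame.
    Hypothetical sketch; the patent does not define the selection rule."""
    ratio = dst_rate / src_rate
    kept, acc = [], 0.0
    for frame in frames:
        acc += ratio
        if acc >= 1.0:        # accumulator crossed a whole output frame
            kept.append(frame)
            acc -= 1.0
    return kept
```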
  • Another operation that the video data compression device 48 may perform is a compression routine.
  • compression routines exist which can allow one to transmit the same image using fewer bits of data. Compression routines can compromise accuracy for speed and size, but the induced errors may not be significant given the context.
  • the video data compression device 48 will not only be able to compress the video data on a frame by frame basis, but will also be able to change compression routines within a single frame of data.
  • the compression routine can either be predefined by the manufacturer and fine tuned by the host device 12 of FIG. 1 or, preferably, the host device 12 will be able to program the video data compression device 48 as needed by the specific application that the host device 12 is set to perform. For example, if the host device 12 is set to transmit a video image via a modem to a second party, then the video data compression device 48 will be configured to compress the information in the scaled digital video data signal in such a way that improves the speed of transmission to the second party via the modem.
  • the program for such a compression routine is well within the capabilities of those of ordinary skill in the art.
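The patent leaves the choice of compression routine to the implementer. As one minimal, lossless stand-in (not the routine the patent contemplates, which is unspecified), a run-length encoder illustrates how the same image can be sent in fewer bits when it contains uniform regions:

```python
def rle_encode(data):
    """Run-length encode a sequence of pixel values into
    (value, run_length) pairs. A hypothetical stand-in for the
    unspecified compression routine."""
    runs = []
    for value in data:
        if runs and runs[-1][0] == value:
            runs[-1][1] += 1          # extend the current run
        else:
            runs.append([value, 1])   # start a new run
    return [tuple(run) for run in runs]

def rle_decode(runs):
    """Invert rle_encode, recovering the original pixel sequence."""
    return [value for value, count in runs for _ in range(count)]
```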
  • In FIG. 4, a block diagram of the audio remote device 20 from FIG. 1 is shown in further detail.
  • the analog audio remote device 20 starts with an analog audio source 80 which generates an analog audio data signal.
  • the analog audio data signal is transmitted over line 82 to a conventional audio data A/D converter 84 which samples the analog audio data signal and converts it to a digital audio data signal.
  • the digital audio data signal is transmitted over line 86 to an audio data compression device 88.
  • compression of the digital audio data signal may not be necessary and may be ignored. In such applications it may be cheaper to also eliminate the audio data compression device 88 altogether.
  • the audio data compression device 88 may optionally compress or buffer the data in a manner similar to that described above with respect to the video data which is well within the capabilities of those of ordinary skill in the art.
  • the digital audio data signal is transmitted over the line 26 of FIG. 1 to the interface device 24 of FIG. 1.
  • the audio remote device 20 will also be capable of audio playback and not just recording.
  • In FIG. 5, the block diagram of the audio-video system 10 is shown again, consisting of the host device 12, the remote device 14, and the interface cable 16. More specifically, FIG. 5 shows the details of the interface cable 16 between the host device 12 and the remote device 14 and the signals that are carried on the various conductors that make up the interface cable 16.
  • the interface cable 16 consists of nine conductors contained in a single cable bundle.
  • the first conductor 100 carries a standard clock signal.
  • the clock signal preferably cycles at 7 MHz.
  • the second conductor 102 carries a host control signal.
  • the host control signal contains commands from the host device 12 to the remote device 14.
  • the third conductor 104 carries a remote control signal.
  • the remote control signal contains responses from the remote device 14 to the host device 12. Examples of the commands and responses will be discussed with respect to FIGs. 6 and 7 below.
  • the eighth conductor 114 carries a standard ground signal.
  • the ninth conductor 116 carries a constant voltage potential.
  • the ninth conductor preferably carries a constant voltage potential of five volts.
  • the top line 120 shows the system clock signal.
  • the system clock signal is generated by the Peripheral Bus of the host device 12 of FIG. 5.
  • the second line 122 shows the host clock signal.
  • the host clock signal can either be independently generated or, preferably, be based on the system clock signal.
  • a command is one word or four nibbles long and one nibble can be transmitted per clock cycle over the data bus, so a command takes four consecutive clock cycles to transmit.
  • the host device 12 of FIG. 5 takes the host control signal carried on conductor 102 of FIG. 5 from its normally high state to a low state on the falling edge of the first and third cycles of the word. This is shown on the third line 124 as "a" and "c" respectively.
  • the remote device 14 of FIG. 5 acknowledges the command from the host device 12 by taking the remote control signal carried on conductor 104 of FIG. 5 from its normally high state to a low state on the rising edge of the first and third cycles of the word. This is shown on the fourth line 126 as "b" and "d" respectively.
  • the host device 12 transmits and the remote device 14 receives the command over the data bus of lines 106, 108, 110, and 112 respectively of FIG. 5. This is shown on the fifth line 128 as "c0", "c1", "c2", and "c3" respectively.
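The four-nibble command format can be sketched as follows. The transmit order (least significant nibble first as c0) is an assumption; the text only states that a word is four nibbles sent over four consecutive clock cycles.

```python
def word_to_nibbles(word):
    """Split a 16-bit word into the four 4-bit nibbles (c0..c3) driven
    onto the 4-wire data bus, one nibble per clock cycle.
    Low-nibble-first order is assumed for illustration only."""
    assert 0 <= word <= 0xFFFF
    return [(word >> (4 * i)) & 0xF for i in range(4)]

def nibbles_to_word(nibbles):
    """Reassemble the 16-bit word on the receiving side."""
    word = 0
    for i, nibble in enumerate(nibbles):
        word |= (nibble & 0xF) << (4 * i)
    return word
```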
  • the commands accomplish five types of operations.
  • the first operation is a simple reset of the remote device 14 of FIG. 5. No data is transferred in this operation.
  • a reset operation involves only the issuance of a reset command from the host device 12 of FIG. 5 to the remote device 14 in the manner described above.
  • the reset operation puts the remote device 14 into a predefined mode of waiting for further commands from the host device 12.
  • the second and third operations comprise the read protocol shown in FIG. 6.
  • the read protocol has two modes. The first is a "word" mode and the second is a "stream" mode.
  • in word mode, the host device 12 of FIG. 5 reads one word from one register of the remote device 14 of FIG. 5.
  • in stream mode, the host device 12 reads multiple words in succession from one register of the remote device 14.
  • the read protocol is used to retrieve captured audio or video data from the remote device 14 for use in the host device 12 in any of the various applications that might be running in the host device 12.
  • a read word operation begins with the issuance of a read word command from the host device 12 of FIG. 5 to the remote device 14 of FIG. 5 in the manner described above.
  • the host device 12 surrenders control of the data bus to the remote device 14.
  • the remote device 14 designates that the word to be read is being placed on the data bus by taking the remote control signal low on the rising edge of the first cycle of the word. This is shown on the fourth line 126 as "e".
  • the host device 12 acknowledges the designation by taking the host control signal low on the falling edge of the first cycle of the word. This is shown on the third line 124 as "f".
  • the remote device 14 transmits and the host device 12 receives the word over the data bus. This is shown on the fifth line 128 as "d0", "d1", "d2", and "d3" respectively. After the four nibbles that make up the word are transmitted and received, the read word operation is complete. The remote device 14 then surrenders control of the data bus to the host device 12 and waits for another command from the host device 12.
  • a read stream operation begins with the issuance of a read stream command from the host device 12 of FIG. 5 to the remote device 14 of FIG. 5 in the manner described above.
  • the host device 12 surrenders control of the data bus to the remote device 14.
  • the remote device 14 designates that a word to be read is being placed on the data bus by taking the remote control signal low on the rising edge of the first cycle of the word. This is shown on the seventh line 132 as "e1".
  • the host device 12 acknowledges the designation by taking the host control signal low on the falling edge of the first cycle of the word. This is shown on the sixth line 130 as "f1".
  • the remote device 14 transmits and the host device 12 receives the word over the data bus. This is shown on the eighth line 134 as "d01", "d11", "d21", and "d31" respectively.
  • the remote device 14 transmits other words in a similar manner until the host device 12 decides to terminate the read stream operation.
  • the last word transmitted in the read stream mode (the "nth" word) begins the same as the other words in the stream when the remote device 14 designates that a word to be read is being placed on the data bus by taking the remote control signal low on the rising edge of the first cycle of the word. This is shown on the seventh line 132 as "en".
  • the host device 12 acknowledges the designation by taking the host control signal low on the falling edge of the first cycle of the word. This is shown on the sixth line 130 as "fn". Simultaneously, the remote device 14 transmits and the host device 12 receives the word over the data bus. This is shown on the eighth line 134 as "d0n", "d1n", "d2n", and "d3n" respectively. The difference is that on the falling edge of the fourth cycle of the word, the host device 12 issues the end of stream command to the remote device 14 to terminate the read stream operation by taking the host control signal low. This is shown on the sixth line 130 as "g". This completes the read stream operation. The remote device 14 then surrenders control of the data bus to the host device 12 and waits for another command from the host device 12.
  • the fourth and fifth operations comprise the write protocol shown in FIG. 7.
  • the write protocol also has two modes. The first is a "word" mode and the second is a "stream" mode.
  • in word mode, the host device 12 of FIG. 5 writes one word to one register of the remote device 14 of FIG. 5.
  • in stream mode, the host device 12 writes multiple words in succession to one register of the remote device 14.
  • the write protocol is used to load micro-code from the host device 12 to the remote device 14.
  • the write protocol could also be used for playback of previously captured audio or video data by the remote device 14.
  • a write word operation begins with the issuance of a write word command from the host device 12 of FIG. 5 to the remote device 14 of FIG. 5 in the manner described above.
  • the host device 12 designates that the word to be written is being placed on the data bus by taking the host control signal low on the falling edge of the first cycle of the word. This is shown on the third line 136 as "e".
  • the remote device 14 acknowledges the designation by taking the remote control signal low on the rising edge of the first cycle of the word. This is shown on the fourth line 138 as "f".
  • the host device 12 transmits and the remote device 14 receives the word over the data bus. This is shown on the fifth line 140 as "d0", "d1", "d2", and "d3" respectively.
  • after the four nibbles that make up the word are transmitted and received, the write word operation is complete.
  • the remote device 14 then waits for another command from the host device 12.
  • a write stream operation begins with the issuance of a write stream command from the host device 12 of FIG. 5 to the remote device 14 of FIG. 5 in the manner described above.
  • the host device 12 designates that a word to be written is being placed on the data bus by taking the host control signal low on the falling edge of the first cycle of the word. This is shown on the sixth line 142 as "e1".
  • the remote device 14 acknowledges the designation by taking the remote control signal low on the rising edge of the first cycle of the word. This is shown on the seventh line 144 as "f1". Simultaneously, the host device 12 transmits and the remote device 14 receives the word over the data bus.
  • the host device 12 transmits other words in a similar manner until the host device 12 decides to terminate the write stream operation.
  • the last word transmitted in the write stream mode (the "nth" word) begins the same as the other words in the stream when the host device 12 designates that a word to be written is being placed on the data bus by taking the host control signal low on the falling edge of the first cycle of the word. This is shown on the sixth line 142 as "en".
  • the remote device 14 acknowledges the designation by taking the remote control signal low on the rising edge of the first cycle of the word. This is shown on the seventh line 144 as "fn". Simultaneously, the host device 12 transmits and the remote device 14 receives the word over the data bus. This is shown on the eighth line 146 as "d0n", "d1n", "d2n", and "d3n" respectively. The difference is that on the falling edge of the fourth cycle of the word, the host device 12 issues the end of stream command to the remote device 14 to terminate the write stream operation by taking the host control signal low. This is shown on the sixth line 142 as "g". This completes the write stream operation. The remote device 14 then waits for another command from the host device 12.
  • the interface device 24 handles d e input/output protocol including serializing/deserializing die nibbles of information on the data bus, sequencing events according to the read/write protocols, and anticipating streaming transfers (either read or write).
  • botii d e read and write stream protocols operate on one register in the remote device 14.
  • the interface device 24 anticipates streaming transfers by either filling that register with data after it has been read or emptying that register of data after it has been written.
  • the interface device 24 handles the resource sharing of all of the real time workings (audio and/or video) of the remote device 14.
  • Common resources of the remote device 14 include a memory (not shown) and the interface cable 16.
  • an arbiter (not shown) can manage access to these common resources in a hierarchical or round robin manner.
  • the interface device 24 handles register access by converting a numerical address into a physical location. Implementations of each of the above functions are well known to those of ordinary skill in the art and need not be described further herein.
  • host bus circuitry (not shown) which is well known to those of ordinary skill in the art for communicating between the interface cable 16 and a host communications bus (not shown) of the host device 12.
  • the host communications bus can be ISA, PCMCIA, PCI, or any of a number of other communications bus standards known to those of ordinary skill in the art.
  • FIGs. 1, 2, and 4 are only block diagrams, each of the lines 25, 26, and 27, and the interface cable 16 of FIG. 1, lines 32, 38, 42, and 46 of FIG. 2, and lines 82 and 86 of FIG. 4 are shown as single lines but may represent multiple conductive paths depending on the particular needs of the devices connected by the paths.
  • line 32 would actually represent the video interface cable between the remote device 14 of FIG. 1 and the host device 12 of FIG. 1.
  • the digital preprocessor 40, the video data scaling device 44, and the video data compression device 48 would be located inside the host device 12 of FIG. 1 along with the host bus circuitry on a dedicated video card (not shown).
  • the digital preprocessor 40, the video data scaling device 44, and the video data compression device 48 are located inside the remote device 14 of FIG. 1, thereby significantly reducing the elements located within the host device 12 of FIG. 1 and the data that must be transmitted to the host device 12 over the interface cable 16 of FIG. 1.
  • line 82 would actually represent the audio interface cable between the remote device 14 of FIG. 1 and the host device 12 of FIG. 1.
  • the audio data A/D converter 84 and the audio data compression device 88 would also be located inside the host device 12 on the dedicated video card.
  • the audio data A/D converter 84 and the audio data compression device 88 are located inside the remote device 14 of FIG. 1, thereby further reducing the elements located within the host device 12 of FIG. 1 and the data that must be transmitted to the host device 12 over the interface cable 16 of FIG. 1.
  • the host bus circuitry can be an additional IC on the modem card.
  • the host bus circuitry can simply be a number of gates programmed into an existing IC on the modem card. In either case, the circuit layout and design is well within the capabilities of those of ordinary skill in the art.
  • the present invention avoids taking up a valuable expansion slot which can now be put to other uses. Also, it is increasingly the case that computer manufacturers pre-install modems in their new computers at the factory. If the interface mating receptacle and the host bus circuitry are included on the modem card, then there is no additional installation required by the eventual computer owner under the present invention.
  • the interface cable 16 would actually be located inside the host device 12 on the dedicated video card.
  • line 32 of FIG. 2 as the video interface cable and line 82 of FIG. 4 as the audio interface cable between the host device 12 and the remote device 14.
  • this can total twenty-six conductors as the present interface cable, which would be more rigid and heavier than the fewer conductors of the interface cable 16 of the present invention.
  • the interface cable only carried data from the remote device 14 to the host device 12. There was no control over the direction or the content of the flow of data.
  • captured audio and/or video data is stored in the remote device and scaled and compressed if desired. This greatly reduces the amount of data carried by the interface cable 16.
  • the audio and/or video data is carried on the same set of wires, 106, 108, 110, and 112 respectively, and not separate wires as in the present. By reducing the number of wires, the present invention decreases the chance of generating interference problems between the wires.
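The write-stream handshake described above (the host designates each word by taking its control signal low, the remote acknowledges, the word crosses the data bus as four nibbles d0..d3, and the host asserts its control signal again on the fourth cycle of the final word to end the stream) can be sketched as a small behavioral simulation. This is an illustrative model only, not the patented circuit; the function names are assumptions.

```python
# Behavioral sketch of the write-stream protocol: a 16-bit word is
# serialized into four 4-bit nibbles (d0..d3), one per cycle, and the
# host marks the last word with an end-of-stream assertion ("g").
# All names here are hypothetical illustrations.

NIBBLES_PER_WORD = 4  # a 16-bit word crosses the 4-bit data bus in 4 cycles

def split_nibbles(word):
    """Serialize a 16-bit word into four 4-bit nibbles, d0 first."""
    return [(word >> (4 * i)) & 0xF for i in range(NIBBLES_PER_WORD)]

def join_nibbles(nibbles):
    """Deserialize four 4-bit nibbles back into a 16-bit word."""
    word = 0
    for i, nibble in enumerate(nibbles):
        word |= (nibble & 0xF) << (4 * i)
    return word

def write_stream(words):
    """Model one write-stream operation; return the words the remote saw."""
    received = []
    for index, word in enumerate(words):
        # Cycle 1, falling edge: host control low designates a word
        # ("e1".."en"); rising edge: remote control low acknowledges
        # ("f1".."fn").  Cycles 1-4: the word moves as nibbles d0..d3.
        nibbles = split_nibbles(word)
        received.append(join_nibbles(nibbles))
        if index == len(words) - 1:
            # Falling edge of the fourth cycle of the last word: the host
            # issues the end-of-stream command ("g"); the remote then
            # waits for another command.
            pass
    return received

print(write_stream([0x1234, 0xBEEF, 0x00FF]))
```

The round trip through `split_nibbles`/`join_nibbles` mirrors the serializing/deserializing duty the interface device 24 performs on the 4-bit data bus.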
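The round-robin arbitration mentioned above for the remote device's shared resources (the memory and the interface cable) can be illustrated with a minimal arbiter sketch. The class, its methods, and the requester names are hypothetical illustrations, not the patented design.

```python
# Minimal round-robin arbiter: a rotating pointer ensures that no
# requester of a shared resource (e.g. the memory or the interface
# cable) is starved.  Names are illustrative assumptions.

class RoundRobinArbiter:
    def __init__(self, requesters):
        self.requesters = list(requesters)
        self.next_index = 0  # rotating priority pointer

    def grant(self, requests):
        """Grant the resource to the first requester at or after the
        rotating pointer, then advance the pointer past the winner."""
        n = len(self.requesters)
        for offset in range(n):
            i = (self.next_index + offset) % n
            if self.requesters[i] in requests:
                self.next_index = (i + 1) % n
                return self.requesters[i]
        return None  # no requests this cycle

arbiter = RoundRobinArbiter(["video", "audio", "interface"])
print(arbiter.grant({"video", "audio"}))  # video wins first
print(arbiter.grant({"video", "audio"}))  # then audio, by rotation
```

A hierarchical arbiter would instead grant by a fixed priority order; the rotating pointer here is what distinguishes the round-robin variant.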

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Compression Or Coding Systems Of Tv Signals (AREA)

Abstract

An improved video and audio interface for a computer comprising an interface device and an interface cable, and a system using said interface comprising a host device, a remote device, and an interface cable coupled between the host device and the remote device for transmitting data from the remote device to the host device and commands from the host device to the remote device. The remote device, which can transmit data to the host device over the interface, comprises the interface device, a video data scaling device, a video data compression device, an audio data A/D converter, and an audio data compression device. The host device is able to transmit commands to the remote device over the interface and to selectively configure, over the interface, the video data scaling device and the video data compression device.
PCT/US1998/008916 1997-04-30 1998-04-30 Interface video et audio pour ordinateur WO1998049835A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US84643097A 1997-04-30 1997-04-30
US08/846,430 1997-04-30

Publications (1)

Publication Number Publication Date
WO1998049835A1 true WO1998049835A1 (fr) 1998-11-05

Family

ID=25297909

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US1998/008916 WO1998049835A1 (fr) 1997-04-30 1998-04-30 Interface video et audio pour ordinateur

Country Status (1)

Country Link
WO (1) WO1998049835A1 (fr)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP1981262A1 (fr) * 2007-04-02 2008-10-15 Research In Motion Limited Caméra avec plusieurs viseurs

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP0617542A2 (fr) * 1993-03-22 1994-09-28 Canon Kabushiki Kaisha Dispositif d'enregistrement et de réproduction
DE19500516A1 (de) * 1994-01-11 1995-09-28 Thomson Brandt Gmbh Videokamera
WO1996002106A1 (fr) * 1994-07-09 1996-01-25 Vision 1 International Limited Camera video pour reseaux numeriques
EP0754994A2 (fr) * 1995-07-21 1997-01-22 Canon Kabushiki Kaisha Système de contrÔle et unités amovibles attachables à celui-ci
EP0776130A2 (fr) * 1995-11-27 1997-05-28 Canon Kabushiki Kaisha Système de contrÔle de caméra avec fréquence de trame variable

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
BECK J ET AL: "AN OBJECT BASED ARCHITECTURE FOR A DIGITAL COMPRESSION CAMERA", DIGEST OF PAPERS OF THE COMPUTER SOCIETY COMPUTER CONFERENCE (SPRING) COMPCON, TECHNOLOGIES FOR THE INFORMATION SUPERHIGHWAY SAN FRANCISCO, MAR. 5 - 9, 1995, no. CONF. 40, 5 March 1995 (1995-03-05), INSTITUTE OF ELECTRICAL AND ELECTRONICS ENGINEERS, pages 179 - 185, XP000545429 *

Similar Documents

Publication Publication Date Title
US6222885B1 (en) Video codec semiconductor chip
US20060164534A1 (en) High-speed digital video camera system and controller therefor
US5369617A (en) High speed memory interface for video teleconferencing applications
WO2001080560A1 (fr) Traitement video multi-format
US6357047B1 (en) Media pipeline with multichannel video processing and playback
US20040189809A1 (en) Digital imaging apparatus and method for selecting data transfer mode of the same
CA2260932C (fr) Circuit de traitement de videos animees concu pour saisir, visualiser et manipuler des informations videos animees numeriques sur un ordinateur
JPH09179537A (ja) コンピュータ・ディスプレイ用ループバック・ビデオ・プレビュー
EP0122094B1 (fr) Dispositif électronique de stockage d'images fixes avec triage à grande vitesse et sa méthode de fonctionnement
US7110018B2 (en) Communication terminal device and control method thereof
CN100574386C (zh) 能执行附加操作的数字音频/视频装置和方法
EP1031088A1 (fr) Systeme et procede permettant de transmettre des signaux r-v-b dans un systeme informatique a plusieurs utilisateurs
US7158140B1 (en) Method and apparatus for rendering an image in a video graphics adapter
WO1998049835A1 (fr) Interface video et audio pour ordinateur
JPH08139994A (ja) 画像合成システム
JP2785203B2 (ja) 編集装置
US5822013A (en) Selective projection image freeze device
KR100715522B1 (ko) 카메라 컨트롤 장치, 영상 데이터 표시 장치 및 그 방법
GB2179819A (en) Improvements in video signals processing systems
US6300972B1 (en) Video card and video data communication apparatus
JP2001237930A (ja) 情報処理装置及びその方法
US20050043064A1 (en) System and method for displaying data with a mobile phone
KR100282943B1 (ko) 디지털 비디오 카메라용 인터페이스회로
WO1998033316A2 (fr) Adaptateur pour camera video de tournage d'images mobiles permettant d'extraire des images numeriques mobiles et fixes
WO2000060867A1 (fr) Systeme video numerique rle

Legal Events

Date Code Title Description
AK Designated states

Kind code of ref document: A1

Designated state(s): CA CN JP KR

AL Designated countries for regional patents

Kind code of ref document: A1

Designated state(s): AT BE CH CY DE DK ES FI FR GB GR IE IT LU MC NL PT SE

DFPE Request for preliminary examination filed prior to expiration of 19th month from priority date (pct application filed before 20040101)
121 Ep: the epo has been informed by wipo that ep was designated in this application
NENP Non-entry into the national phase

Ref country code: JP

Ref document number: 1998547447

Format of ref document f/p: F

122 Ep: pct application non-entry in european phase
NENP Non-entry into the national phase

Ref country code: CA