WO1995011566A1 - Adaptive videoconferencing system - Google Patents

Adaptive videoconferencing system Download PDF

Info

Publication number
WO1995011566A1
Authority
WO
WIPO (PCT)
Prior art keywords
camera
control signals
controller
amount
user
Prior art date
Application number
PCT/US1994/010968
Other languages
French (fr)
Inventor
Leo M. Cortjens
Kenneth A. Franklin
Richard C. Mays
Curtis M. Smith
Original Assignee
Videoconferencing Systems, Inc.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Videoconferencing Systems, Inc. filed Critical Videoconferencing Systems, Inc.
Priority to EP94929914A priority Critical patent/EP0724809A1/en
Priority to JP7511844A priority patent/JPH09506217A/en
Priority to AU79210/94A priority patent/AU7921094A/en
Publication of WO1995011566A1 publication Critical patent/WO1995011566A1/en

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N7/00 Television systems
    • H04N7/14 Systems for two-way working
    • H04N7/15 Conference systems
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60 Control of cameras or camera modules
    • H04N23/66 Remote control of cameras or camera parts, e.g. by remote control devices

Definitions

  • the present invention relates to videoconferencing systems and more particularly to a videoconferencing system which can accommodate a plurality of different devices and which provides for ease of operation by the user.
  • Typical prior art videoconferencing systems fall into one of two categories: those where the intelligence is centralized in the coder-decoder (codec) or a system control unit; and those where the intelligence is distributed so that each peripheral device controller has the intelligence necessary to directly control other peripheral devices in the system.
  • One shortcoming of centralized intelligence systems is that such systems are not readily adaptable to accommodate new devices and new versions of existing devices. The addition of another peripheral device beyond the number originally planned for, or the addition of a new type of peripheral device, can require a substantial investment in time and money to accommodate the desired additional device or new device.
  • most centralized intelligence systems have a limited capacity with respect to the number of ports available to connect to peripheral devices. Once this capacity has been reached, new devices can be added only by removing existing devices, such as lesser used devices, or by obtaining another codec or system controller which can accommodate the increased number of devices.
  • Distributed intelligence systems such as that shown in U.S. Patent No.
  • pan, tilt, zoom and focus are industry standards which define the four major axes for which a camera may be adjusted. Traditional camera positioning provides for manual adjustment of these axes, as well as buttons which provide for automatically positioning the camera to a preset location. A preset function recalls the pan, tilt, zoom and focus settings that have been previously ascertained and stored for that preset location.
  • Traditional videoconferencing systems provide for rather rudimentary control of these camera functions. That is, the user has a control panel for manually controlling camera functions, such as buttons for up/down, left/right, zoom in/out, and focus.
  • the user can also typically select one of several preset camera settings so that, by the press of a single button, the camera will automatically position and focus itself at some preselected target.
  • the preset function requires planning because the camera must be manually adjusted for the preset, and then the settings stored. The preset button then merely recalls these settings and adjusts the camera accordingly. If a location has not been preset then the user must manually adjust the pan, tilt, zoom, and focus settings for that location.
  • the user may have to refocus the camera.
  • the first attempt to refocus the camera usually is in the wrong direction. That is, the user inadvertently defocuses the camera.
  • the learning process is short, but the need to focus creates delays and frustration.
  • the present invention provides a video teleconferencing system which combines a central intelligence with distributed intelligence to provide a versatile, adaptable system.
  • the system comprises a controller and a plurality of network converters. Each network converter is connected to a system network as well as to one or more peripheral devices.
  • the controller contains the software necessary for its own operation as well as the operation of each of the network converters.
  • the user selects the type of device that is connected to a network converter and the controller sends the software appropriate to that type of device to the network converter.
  • the network converter loads the software into its own memory and is thereby configured for operation with that type of device. This allows a network converter to be quickly programmed for a particular peripheral device. This also allows for quick and convenient upgrading of the system to accommodate new devices.
  • Rather than having to design a new network converter for each type of new peripheral device, software for that new device is written and stored in the controller. The software can then be loaded into a network converter when that new device is added to the system. Therefore, existing network converters can be used to accommodate new devices. This reduces the number and type of network converters that must be maintained in inventory and also minimizes the obsolescence of network converters as new devices and new versions of existing devices become available.
  • the present invention provides that the controller will perform conversion of instructions from the initiating device, such as a mouse, to the controlled device, such as a camera. This allows for easy and convenient upgrading of the system to accommodate new devices because the peripheral devices do not need to understand the signals from other peripheral devices. The controller performs the necessary device-to-device signal translation.
  • one network converter will convert signals from a mouse into network standard control signals which represent the mouse movement, such as left, right, up, down, button 1 depressed, button 1 released, etc., regardless of the type of mouse being used.
  • the controller inspects these network standard control signals to determine the type of action requested by the user.
  • the controller then generates network standard control signals corresponding to the desired action and places these signals onto the network. Examples of network standard control signals intended for the control of a camera might be pan left, pan right, etc.
  • the camera network converter then performs a conversion of the network standard signals from the controller into the type of control signals required for that particular camera, such as +12 volts, -12 volts, binary command 0110, etc.
  • the new device may require control signals which are completely different than any existing device so the control signals presently provided by the camera network converter would not give the desired results.
  • the network standard signals do not change. Rather, new software is written for the camera network converter so that the camera network converter provides the appropriate signals to the new camera, such as +7 volts, -3 volts, binary command 100110, etc.
  • peripheral devices from different manufacturers and new peripheral devices are readily accommodated by adding new software for the controller. The user can then instruct the controller to load the new software into the converter so that the converter is now configured for the new device.
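  • As a minimal sketch of this two-stage translation, the following Python fragment models an input-device converter, the controller's action mapping, and two camera converters. All signal names, codes, and voltage strings are invented for illustration and are not taken from the patent.

```python
# Stage 1: an input-device converter normalizes raw mouse events into
# network standard control signals (names here are hypothetical).
def mouse_to_network_standard(raw_event: dict) -> str:
    mapping = {(-1, 0): "POINTER_LEFT", (1, 0): "POINTER_RIGHT",
               (0, 1): "POINTER_UP", (0, -1): "POINTER_DOWN"}
    return mapping[(raw_event["dx"], raw_event["dy"])]

# Stage 2: the controller maps the user's action onto a device action.
ACTION_TABLE = {"POINTER_LEFT": "PAN_LEFT", "POINTER_RIGHT": "PAN_RIGHT",
                "POINTER_UP": "TILT_UP", "POINTER_DOWN": "TILT_DOWN"}

# Stage 3: the camera's converter turns the network standard signal into
# whatever its particular hardware requires (a voltage, a binary code, ...).
CAMERA_A_CODES = {"PAN_LEFT": "+12V", "PAN_RIGHT": "-12V"}   # analog drive
CAMERA_B_CODES = {"PAN_LEFT": "0110", "PAN_RIGHT": "0111"}   # digital code

def drive_camera(standard_signal: str, device_codes: dict) -> str:
    return device_codes[standard_signal]

event = {"dx": -1, "dy": 0}                  # user moves the mouse left
std = mouse_to_network_standard(event)       # -> "POINTER_LEFT"
action = ACTION_TABLE[std]                   # -> "PAN_LEFT"
print(drive_camera(action, CAMERA_A_CODES))  # -> "+12V" for this camera
print(drive_camera(action, CAMERA_B_CODES))  # -> "0110" for another camera
```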
  • the present invention also provides for control of devices on remote systems.
  • the use of network standard signals allows a user at a local site to easily control a device at a remote site, even if the controller at the local site does not have software appropriate for that type of device.
  • the controller at the local site receives the network standard signals corresponding to the action taken by the user and determines the action (pan left, pan right, etc.) required at the remote site.
  • the local controller then sends the network standard signals for the action to the remote controller.
  • the remote controller receives the network standard signals from the local controller and sends these network standard signals to the remote network converter for the device, and the remote network converter does have the appropriate software for the remote device.
  • the remote network converter then converts the network standard signals into the signals appropriate for that type of peripheral device.
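  • The relay of unconverted network standard signals between sites might be modeled as below; the Controller class, site names, and converter callback are hypothetical stand-ins for the controllers and converters described above.

```python
# Sketch: a local controller forwards network standard signals unchanged;
# only the remote converter holds device-specific software.
class Controller:
    def __init__(self, site, converters):
        self.site = site
        self.converters = converters    # device name -> converter callback

    def handle(self, standard_signal, device, target_site, remote=None):
        if target_site == self.site:
            # Local device: hand the standard signal to the device's own
            # network converter for final, device-specific conversion.
            self.converters[device](standard_signal)
        else:
            # Remote device: forward the standard signal as-is.
            remote.handle(standard_signal, device, target_site)

def remote_pan_tilt_converter(signal):
    print(f"remote converter drives pan/tilt unit: {signal}")

remote = Controller("siteB", {"pan_tilt": remote_pan_tilt_converter})
local = Controller("siteA", {})
local.handle("PAN_LEFT", "pan_tilt", "siteB", remote=remote)
```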
  • the present invention provides alternative methods of adjusting the pan, tilt, zoom and focus of a camera.
  • the user positions a pointer over an object displayed on a monitor and clicks a mouse button. This causes the camera to be automatically positioned so as to center the object in the monitor display.
  • the user uses the pointer to draw a rectangle around the object or area of interest. This causes the camera to be automatically positioned to center the object in the monitor display and adjust the zoom and focus so that the designated area in the rectangle fills the display. This is a substantial improvement over prior art systems in that a camera may be automatically positioned for objects or areas for which there are no preset values.
  • the present invention provides an improvement to panning.
  • the panning speed is automatically adjusted in accordance with the current zoom (field of view) setting.
  • panning will occur at a slow rate so that objects do not fly by at high speed.
  • panning will occur at a fast rate so that objects do not crawl by at slow speed. The result is that, regardless of the zoom setting, objects appear to move across the scene at a fixed, comfortable rate, which is user selectable.
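  • A sketch of this zoom-compensated pan rate; the traverse time (the number of seconds an object should take to cross the field of view) is assumed to be the user-selectable parameter mentioned above.

```python
def pan_rate_deg_per_sec(field_of_view_deg: float,
                         traverse_time_sec: float = 4.0) -> float:
    """Pan rate that makes an object cross the full field of view in
    traverse_time_sec seconds, regardless of the zoom setting."""
    return field_of_view_deg / traverse_time_sec

print(pan_rate_deg_per_sec(6.0))    # zoomed in tight: 1.5 deg/s (slow pan)
print(pan_rate_deg_per_sec(48.0))   # zoomed out wide: 12.0 deg/s (fast pan)
```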
  • the present invention provides an improvement to panning and tilting the camera.
  • the time to complete the change in the pan position is determined and the time to complete the change in the tilt position is determined. Then, the faster process is slowed down so as to be completed at the same time as the slower process. This causes the camera to move smoothly and linearly from the starting position to the ending position.
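  • This synchronization can be sketched as follows; the maximum pan and tilt rates are assumed values.

```python
def coordinated_rates(pan_deg, tilt_deg, max_pan=15.0, max_tilt=10.0):
    """Slow the faster axis so pan and tilt finish at the same time."""
    t_pan = abs(pan_deg) / max_pan      # seconds to finish pan at full speed
    t_tilt = abs(tilt_deg) / max_tilt   # seconds to finish tilt at full speed
    t = max(t_pan, t_tilt)              # the slower axis sets the duration
    if t == 0:
        return 0.0, 0.0
    return abs(pan_deg) / t, abs(tilt_deg) / t

# A 30-degree pan with a 5-degree tilt: the tilt is slowed from 10 deg/s to
# 2.5 deg/s so that both axes complete in the same 2 seconds.
print(coordinated_rates(30.0, 5.0))    # -> (15.0, 2.5)
```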
  • the present invention provides a method for automatically focusing the camera. Each time that the camera is positioned toward and manually focused on an object or area the system automatically stores the camera position and the focus setting. When the camera is next positioned toward the object or area the system automatically recalls the stored focus setting and implements that setting.
  • the present invention defines relationships between regions so that a focus setting may be determined even if that region has not been used before.
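  • One way to model the stored focus settings and a region-based fallback is sketched below; the quantization step and nearest-region rule are illustrative assumptions, not the patent's definition of the region relationships.

```python
focus_memory = {}   # (pan_bin, tilt_bin, zoom_bin) -> stored focus setting

def region_key(pan, tilt, zoom, step=5.0):
    """Quantize a camera position into a region (step size is assumed)."""
    return (round(pan / step), round(tilt / step), round(zoom / step))

def store_focus(pan, tilt, zoom, focus):
    focus_memory[region_key(pan, tilt, zoom)] = focus

def recall_focus(pan, tilt, zoom):
    key = region_key(pan, tilt, zoom)
    if key in focus_memory:
        return focus_memory[key]        # this region was focused before
    if not focus_memory:
        return None                     # nothing stored yet
    # Fall back to the nearest previously focused region.
    nearest = min(focus_memory,
                  key=lambda k: sum((a - b) ** 2 for a, b in zip(k, key)))
    return focus_memory[nearest]

store_focus(10.0, 0.0, 30.0, focus=0.62)   # user focuses manually once
print(recall_focus(12.0, 1.0, 31.0))       # same region -> 0.62 recalled
print(recall_focus(40.0, 8.0, 30.0))       # new region -> nearest, 0.62
```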
  • the present invention further provides for automatic selection of the camera to be controlled.
  • the user simply positions a pointer over the desired scene and the system automatically selects, for further control, the camera which is providing that scene. This method is particularly useful when picture-within-picture, split screen, and four-quadrant screen displays are in use.
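  • Pointer-based camera selection for, say, a four-quadrant display could be sketched as follows; the window geometry and camera names are hypothetical.

```python
# Screen regions (x0, y0, x1, y1) -> the camera whose scene they display.
WINDOWS = {(0, 0, 640, 360): "camera1",   (640, 0, 1280, 360): "camera2",
           (0, 360, 640, 720): "camera3", (640, 360, 1280, 720): "camera4"}

def select_camera(px, py):
    """Return the camera providing the scene under the pointer, if any."""
    for (x0, y0, x1, y1), cam in WINDOWS.items():
        if x0 <= px < x1 and y0 <= py < y1:
            return cam
    return None    # pointer is outside every scene window

print(select_camera(700, 100))    # pointer in the upper right -> "camera2"
```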
  • Figure 1 is a block diagram of the preferred embodiment of the present invention.
  • Figure 2 is a block diagram of a serial interface-type network converter.
  • Figure 3 is a block diagram of a parallel interface-type network converter.
  • Figure 4 is a block diagram of a specialized-type network converter.
  • Figures 5A and 5B are a flow chart of the method used for positioning a camera.
  • Figures 6A and 6B are an illustration of the operation of the automatic zoom feature of the present invention.
  • Figure 7 is a flow chart of the method for controlling the aim point and the zoom operation of the camera.
  • Figure 8 is a schematic block diagram of a video unit control node.
  • Figure 9 is a schematic block diagram of an audio unit control node.
  • Figures 10A-10C are illustrations of the relationship between regions.
  • Figures 11A and 11B are a flow chart of the camera focusing process.
  • Figure 12A is an illustration of the preferred embodiment of a camera of the present invention.
  • Figure 12B is an illustration of the feedback system associated with the camera controls.
  • Figure 13 is an illustration of a two-monitor videoconferencing system of the present invention.
  • FIG. 1 is a block diagram of the preferred embodiment of the present invention.
  • the videoconferencing system comprises a controller 10, a plurality of network converters (C) 11A-11K connected to a network 23, a mouse 12, a control panel 13, an audio unit control node 14, a video unit control node 15, a coder-decoder (codec) 16, a camera unit control node 17, a joystick 18, a power supply 19, a video cassette recorder/playback unit (VCR) 20, monitors 21, and a modem 22.
  • the video teleconferencing system also comprises items which, for the sake of clarity, are not shown in Figure 1, such as: cameras, pan/tilt and zoom/focus units for the cameras, microphones, speakers, audio cabling, video cabling, and telephone and power wiring.
  • Each device 10, 12-22 is connected to a converter 11A-11K.
  • the converters are connected, preferably in a daisy-chain (serial) manner, via the network designated generally as 23.
  • Converter 11A is shown as part of controller 10, and converters 11B-11K are shown as stand-alone components which are separate from their respective connected devices 12-22. However, this is merely a preference, and any converter 11 may be a stand-alone component or may be a part of its associated device.
  • the network 23 is the LON-based network developed by Echelon, Inc., Palo Alto, California. However, other networks, such as Ethernet, may be used.
  • Each converter 11 contains information which either converts network standard signals on network 23 into control signals for the connected device 10, 12-22, converts control/status signals for the connected device(s) into network standard signals for network 23, or both.
  • network converter 11B will convert signals from the mouse 12 into network standard control signals which represent the mouse movement, such as left, right, up, down, button 1 depressed, button 1 released, etc.
  • Network converter 11B provides the same network standard control signals for a particular type of mouse movement regardless of the type of mouse being used.
  • network standard control signals from control devices such as mouse 12, control panel 13, joystick 18 or codec 16, are sent, via converters 11 and network 23, to controller 10.
  • a single converter can service two or more devices, such as converter 11B servicing mouse 12 and joystick 18, and converter 11I servicing two monitors 21A and 21B.
  • converter 11B also sends information as to whether the activity is associated with the mouse 12 or the joystick 18.
  • the controller 10 then inspects these network standard control signals to determine the type of action requested by the user and the device which should take the action, generates network standard control signals corresponding to the desired action, and places these signals onto the network 23.
  • a converter 11 inspects the address of the incoming network standard signals on the network 23 to determine if the data is intended for that converter or its connected device.
  • the converter 11 will capture the data, which is a network standard control signal representing the desired action, and convert the data into the appropriate type of signal for the connected device.
  • the mouse movement signals are converted by converter 11B into network standard control signals indicating, for example, the direction of the movement of the mouse and the status of the buttons on the mouse (depressed, not depressed).
  • Converter 11B then generates an address for controller 10 and places these network standard signals on network 23.
  • Converters 11C-11K ignore these signals because the address indicates that the signals are not for them.
  • Converter 11A recognizes the address as its own, captures these signals, and provides the signals to controller 10. Controller 10 determines that the network standard control signals signify a mouse movement corresponding to an instruction for the selected camera to pan to the left and, accordingly, generates network standard control signals corresponding to such camera movement. Controller 10 then instructs converter 11A to address these signals to the network converter for pan/tilt unit control node 17 and to place these signals on network 23.
  • Converter 11G recognizes the address as its own (or as intended for its connected pan/tilt device), and captures the network standard signals. Converter 11G then generates control signals appropriate for the type of pan mechanism (not shown) used with the selected camera.
  • the network standard signals from the mouse or to the pan/tilt mechanism will not change. Rather, the network converters 11 will convert the signals from the mouse 12 into network standard signals and will convert the network standard signals into signals appropriate for the pan/tilt mechanism.
  • the signals from mouse 12 may indicate that mouse 12 is being moved to the left at a certain rate and the appropriate signals provided to the pan motor may be +12 volts or, if the pan motor has a digital controller or interface, the signals provided by converter 11G may be a binary signal such as 101011 or some other code which corresponds to the code and format required to achieve the specified action.
  • controller 10 may not be required and converters 11B and 11G may be programmed to achieve the desired correspondence between the movement of the mouse 12, the depression of keys on control panel 13, and movement of the pan motor.
  • mouse 12 is also used to specify functions which do not have a one-to-one correspondence between mouse movement and pan motor action, such as the point-and-click and the draw-and-release operations described below and therefore all network signals are directed to or come from controller 10.
  • status information from monitor control node 21 is addressed by converter 11I to controller 10 (converter 11A) and then placed on network 23. Controller 10 then inspects the status information to determine if the selected monitor (not shown) is in the proper mode, such as on or off.
  • Control panel 13 is a conventional videoconferencing system control panel, well known in the art, and provides, via buttons, such functions as pan left, pan right, tilt up, tilt down, mute on/off, zoom in/out, focusing, presettable camera settings, and volume up/down.
  • Audio unit control node 14 controls the flow of audio signals among the devices which send or receive audio signals, such as microphones, speakers, codec 16, telephone lines, and VCR 20.
  • Video unit control node 15 controls routing of video signals among the different devices which send or receive video signals such as codec 16, VCR 20, cameras, and monitors 21.
  • Codec 16 provides conventional codec functions.
  • Camera unit control node 17 controls the pan, tilt, zoom, and focus of the cameras and provides feedback regarding these parameters.
  • Power supply 19 provides operating power for the converters 11 and also for the other devices 10, 12-18, 20-22 connected to the system.
  • VCR 20 is a conventional video cassette recorder/playback device.
  • Monitors 21 are commercially available monitors and, in the preferred embodiment, are Mitsubishi color televisions, model CS-35EX1, available from Mitsubishi Electronics America, Inc., Cypress, California.
  • Modem 22 is a conventional modem, preferably having a data communications rate of at least 9600 bits per second.
  • a typical codec 16 has a port for connection to one or more dial-up or dedicated telephone lines.
  • codecs have a data port which can also be used for transferring data as well as for setting up the codec. This data port is advantageously used in the present invention to allow a codec 16 to be configured by the controller 10.
  • codec 16 is a type Visualink 5000, manufactured by NEC America, Inc., Hillsboro, Oregon.
  • Controller 10 will, via converters 11A and 11F and network 23, instruct codec 16 to dial up or otherwise access the remote codec (the codec at the other videoconferencing location). Codec 16 will then attempt to establish communications with the remote codec. If communications are successfully established the codecs will negotiate what features will be used and then the session may begin. However, if communications cannot be established, such as because the codecs are configured for different protocols, the local codec 16 will report to controller 10 that codec 16 was able to contact the remote codec but was unable to establish communications (handshake) with the remote codec because the remote codec was using a different protocol.
  • Controller 10 will then, via converters 11A and 11J, instruct modem 22 to dial up the remote modem (the modem for the videoconferencing system at the other location). Once controller-to-controller communications have been established via modem, controller 10 can instruct the remote controller to configure the remote codec for a particular protocol. The remote controller will take action, if necessary, to configure the remote codec to the same protocol. Conversely, controller 10 can receive information from and/or negotiate with the remote controller as to the protocol(s) supported by, or the current configuration of, the remote codec and then configure codec 16 to the same protocol as the remote codec. Then, controller 10 can again instruct codec 16 to establish communications with the remote codec and, as both codecs have now been configured to the same protocol, the codecs can establish communications and negotiate features, and the videoconferencing session can begin.
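  • The fallback negotiation described above might be sketched as follows, with simplified stand-ins for codec 16 and the controllers; the protocol names and the rule for choosing among shared protocols are illustrative.

```python
class Codec:
    def __init__(self, protocols, active):
        self.protocols, self.active = protocols, active
    def handshake(self, other):
        return self.active == other.active    # same protocol -> success

class Ctrl:
    def configure(self, codec, protocol):
        codec.active = protocol               # reconfigure via data port

def establish_session(local_codec, remote_codec, local_ctrl, remote_ctrl):
    if local_codec.handshake(remote_codec):
        return True                           # protocols already match
    # Handshake failed: negotiate controller-to-controller via modem.
    common = set(local_codec.protocols) & set(remote_codec.protocols)
    if not common:
        return False                          # no shared protocol exists
    protocol = sorted(common)[0]              # pick one (rule is assumed)
    local_ctrl.configure(local_codec, protocol)
    remote_ctrl.configure(remote_codec, protocol)
    return local_codec.handshake(remote_codec)   # retry, now matched

local = Codec(["H.320", "proprietaryA"], active="proprietaryA")
remote = Codec(["H.320", "proprietaryB"], active="proprietaryB")
print(establish_session(local, remote, Ctrl(), Ctrl()))   # True, via H.320
```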
  • controller 10 may also communicate with a similarly situated controller at a remote site (not shown) via the data port on codec 16.
  • the user using mouse 12, control panel 13, or joystick 18, may command a particular action to be performed at the remote site, such as panning the remote camera to the left or right, tilting the remote camera up or down, etc.
  • the user's actions are converted into network standard control signals and these signals are sent by converter 11B to controller 10.
  • Controller 10 determines the action required at the remote site and sends, via network 23 and codec 16, network standard control signals corresponding to the action to the remote controller.
  • the remote controller then sends, via its own network, the network standard signals to the converter for the remote pan/tilt unit.
  • the remote converter then generates the appropriate instruction for the remote pan/tilt unit control node which, in turn, causes the pan/tilt mechanism for the selected remote camera to perform the action specified by the user at the local site.
  • the user at the local site can therefore control all of the functions of all the devices at the remote site that the remote user can control at the remote site, even if the remote site has devices available which are not available at the local site.
  • some functions at a site are preferably controlled only by the user at that particular site, such as microphone muting, monitor on/off operation, and speaker volume control settings.
  • the present invention also provides for system diagnostics.
  • camera unit control node 17 in addition to receiving instructions from controller 10, also reports the results of an instruction to controller 10.
  • Each pan/tilt unit has a position indicator, either as part of the unit or as a retrofit device.
  • the position indicator indicates the current pan position and the current tilt position.
  • the camera unit control node 17 accepts the position signals from the position indicator and provides these signals to the controller 10.
  • Controller 10 inspects these signals to determine whether the selected pan/tilt unit is taking the proper action with respect to the control signals. For example, assume that controller 10 has instructed a particular pan/tilt unit to pan in a certain direction at a certain rate but that the pan/tilt unit either does not pan, or pans at a different rate.
  • the camera unit control node 17 reports the response of the selected pan/tilt unit to controller 10. If the response of the selected pan/tilt unit is improper then controller 10 will cause a report to be generated which alerts the system operator to the problem.
  • the report may be provided in a number of ways. For example, the presence of the report may be indicated by an icon on the screen of a monitor 21. This alerts the system operator to select the report to ascertain the nature of the problem. Or, the controller 10 may cause a report to be printed, either by a printer (not shown) connected to a printer port on controller 10 or by a printer (not shown) connected as another device on the network 23. The report may also indicate the severity of the problem.
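  • The commanded-versus-reported comparison can be sketched as below; the 10% tolerance and the report wording are assumptions.

```python
def check_pan_response(commanded_rate, reported_rate, tolerance=0.1):
    """Return a diagnostic report string, or None if the response is proper."""
    if reported_rate == 0.0 and commanded_rate != 0.0:
        return "SEVERE: pan/tilt unit did not move when commanded"
    if abs(reported_rate - commanded_rate) > tolerance * abs(commanded_rate):
        return (f"WARNING: reported pan rate {reported_rate:.1f} deg/s "
                f"differs from commanded {commanded_rate:.1f} deg/s")
    return None

print(check_pan_response(10.0, 0.0))     # stuck unit -> severe report
print(check_pan_response(10.0, 6.5))     # wrong rate -> warning report
print(check_pan_response(10.0, 10.2))    # within tolerance -> None
```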
  • Modem 22 also allows for remote diagnostics and reporting. If the videoconferencing system is, for example, being serviced by a remote party then the remote party can, using a personal computer and a modem, call up modem 22, establish communications with controller 10, and instruct controller 10 to send, via modem 22, the current system diagnostics.
  • controller 10 can be programmed to use modem 22 to call up the remote party, establish communications with the remote computer, and automatically send the current system diagnostics.
  • the programming may specify that the call is to be performed at a certain time of day, such as during off-duty hours, or whenever a serious failure occurs, such as the complete failure of a pan/tilt unit, or both.
  • the controller-to-controller communications via either codecs or modems, also allows the controller at one site, such as a remote site, to inform the controller at another site, such as the local site, that a particular device or function is inoperative at the remote site. Then, when the user attempts to use that device or function the local controller will disregard the instructions from the user and inform the user that that device or function is out of service.
  • Controller 10 in addition to performing system diagnostics, also attempts simple system repairs. For example, if the pan/tilt unit will not pan in one direction, controller 10 will instruct the pan/tilt unit to pan in the other direction so as to attempt to dislodge any cable which may be snagged. If this action is successful and the pan/tilt unit is then operational controller 10 will log the failure and the repair so that the service technician will know to inspect that unit for loose or snagged cables and to service that unit. If the action is not successful then controller 10 will disregard future instructions from the user as to the desired movement of that pan/tilt unit and will not attempt to send further instructions with respect to the failed function. That is, pan instructions will not be sent because the pan function is not operative, but tilt instructions may be sent because that function still operates properly. However, as another option, controller 10 may be programmed to cause operating power to be entirely removed from the failed pan/tilt unit.
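  • The repair attempt can be sketched as below; the PanTiltUnit interface is a hypothetical stand-in for the actual pan/tilt hardware and its feedback.

```python
class PanTiltUnit:
    def __init__(self, snagged=True):
        self.snagged = snagged
        self.disabled_functions = set()
    def pan(self, direction):
        self.snagged = False       # pretend reversing freed the cable
    def pan_works(self, direction):
        return not self.snagged

def attempt_pan_repair(unit, failed_direction, log):
    opposite = "right" if failed_direction == "left" else "left"
    unit.pan(opposite)                       # try to dislodge a snagged cable
    if unit.pan_works(failed_direction):     # retest the failed direction
        log.append(f"pan-{failed_direction} failure cleared by reversing; "
                   "inspect unit for loose or snagged cables")
        return True
    log.append(f"pan-{failed_direction} inoperative; further pan "
               "instructions for this unit will be disregarded")
    unit.disabled_functions.add("pan")       # tilt commands still allowed
    return False

log = []
attempt_pan_repair(PanTiltUnit(), "left", log)
print(log)    # repair logged so a technician still inspects the unit
```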
  • the camera unit control node 17 also controls the zoom and focus of the connected cameras (not shown).
  • the cameras have a zoom position indicator and a focus position indicator, either as part of the unit or as a retrofit device. Controller 10 can therefore determine whether a selected camera is operating properly.
  • each monitor 21 has an on/off indicator, described below, and converter 11I reports the status of each monitor. Controller 10 can therefore determine whether a selected monitor is on or off.
  • codec 16 performs limited self-diagnostics on its own operation. Controller 10, either in response to an error signal from codec 16, or at periodic intervals, will instruct codec 16 to report its status.
  • Controller 10 can then take the appropriate reporting action, if any is required, and/or switch to another codec (not shown) connected to network 23.
  • the LON network is used because converters 11, in general, draw operating power via the network 23 and do not require a separate source of power nor require power from the connected device. This is advantageous in that the network and the system will continue to function even if a connected device, such as VCR 20 or modem 22, is removed from the network or is powered down.
  • a power supply 19 is connected to the network 23 and provides operating power for the converters 11. Power supply 19 also provides operating power, such as 110 VAC or 12 VDC, to each peripheral device. This operating power may be provided via network 23 or provided via separate power cables to each peripheral device.
  • Power supply 19 provides AC and DC power, as required, to each peripheral device. Power supply 19 is connected to converter 11K and may therefore be controlled by the user. This allows the user to turn on and turn off selected peripheral devices, as desired, by removing operating power from the device. This provides an additional way of turning off a device if the device is otherwise non-responsive to signals sent via network 23, and also provides a safety factor in that the user can completely remove operating power from a device. Further, in the preferred embodiment, converter 11K has an internal timer. If there is no user activity, signified by a lack of activity of mouse 12, control panel 13, or joystick 18, then converter 11K will send a "sleep" signal to controller 10. This causes controller 10 to go into a standby mode, thereby conserving power.
  • Converter 11K will also instruct power supply 19 to remove operating power from the peripheral devices. Although converter 11K and power supply 19 are shown as separate devices, it will be appreciated that both functions may be performed by a single device.
  • power supply 19 is not responsive to signals on network 23 but merely provides operating power for the converters 11.
  • either controller 10 or converter 11K may have the internal timer.
  • power supply 19 is not used and controller 10 has the internal timer, and also provides operating power for the converters 11 on network 23 via the connection to converter 11A.
  • controller 10 is a personal computer, such as a COMPAQ Prolinea, having a 120-megabyte hard drive, a 4-megabyte random access memory, and a 3-1/2-inch floppy disk drive. Controller 10 does not need to have a screen or a keyboard because, in the preferred embodiment, a monitor 21 is used as a screen, and mouse 12 and control panel 13 may be used in place of a keyboard. However, if desired, a screen and a keyboard could be connected directly to controller 10.
  • although mouse 12, control panel 13, and joystick 18 are shown as being connected to converters 11B and 11C by wiring, it will be appreciated that there are commercially available devices 12, 13, and 18 which do not have a wire connection but, instead, communicate by infrared (IR) signals. These devices may also be used with the present invention.
  • the appropriate network converter 11 would have an IR receiver, would respond to the infrared signals, and would provide the corresponding network standard signals to controller 10.
  • such a converter would then be a specialized-purpose converter.
  • a specialized purpose converter is described below which transmits IR signals to IR receivers in monitors 21. In this case, the roles of transmitter and receiver are reversed; that is, the devices 12, 13, 18 transmit and the converters 11B, 11C receive.
  • Converters 11 fall into three general classes: serial interface, parallel interface, and specialized purpose.
  • a codec 16 is a serial interface device and therefore converter 11F would be a serial interface-type converter
  • a VCR 20 may have a parallel interface and therefore converter 11H would be a parallel interface-type converter.
  • monitors 21 are of the type which can be remotely controlled by, for example, a handheld infrared remote control.
  • Converter 11I is therefore a specialized type of converter in that it can provide the infrared signals necessary to control the monitors 21 and has the necessary components for monitoring the state of operation of the monitors 21.
  • Figure 2 is a block diagram of a serial interface-type network converter 11.
  • a serial-type converter 11 comprises a network connector 40, a power supply/filtering circuit 41, an RS-485 transceiver 42, a parallel-serial and serial-parallel (P/S-S/P) converter 48, a microprocessor 43, a basic program memory 44, an installed program memory 45, a set-up button 46, a display 47, an RS-232 charge pump/transceiver 50, and a serial port connector 51.
  • Connector 40 is connected to network 23 and connector 51 is connected to a serial interface device, such as codec 16.
  • Power supply/filtering circuit 41 draws power from network 23 and provides filtered power to the several circuits of converter 11.
  • Transceiver 42 provides voltage level, balanced-to-single-sided (unbalanced), and single-sided-to-balanced conversion of the signals between network 23 and P/S-S/P converter 48.
  • P/S-S/P converter 48 provides parallel-serial and serial-parallel conversion of the signals between transceiver 42 and the microprocessor 43.
  • microprocessor 43 is a Neuron microprocessor, manufactured by Motorola Semiconductor Products, Phoenix, Arizona and the P/S-S/P conversion functions of converter 48 are performed by the microprocessor 43.
  • Basic program memory 44 contains an identification number, such as a serial number, start-up procedures and basic operating instructions for microprocessor 43, such as instructing microprocessor 43 of the port or address of transceivers 42 and 50, button 46 and display 47.
  • memory 44 is a programmable read only memory (PROM).
  • Installed program memory 45 contains configuration information and operating instructions as to the conversion required between signals present on network 23 and the corresponding signals to be output via connector 51, and vice versa. Examples of the type of information that may be installed in memory 45 are the voltage polarity and voltage levels required to control the connected peripheral device, the binary codes and format required to control the connected peripheral device, and similar information concerning signals that may be received from the connected peripheral device.
  • memory 45 comprises both an electrically erasable programmable read only memory (EEPROM) and a random access memory (RAM).
  • Button 46 is used to initialize (set up) converter 11, as explained in more detail below.
  • Display 47 is, in the preferred embodiment, a light emitting diode (LED) and is off when microprocessor 43 has been properly set up (configured), and flashes when microprocessor 43 is in the set up mode (not configured).
  • controller 10 contains, in its memory (not shown), a plurality of programs for the operation of converters 11. There is a separate program for each type of device that may be connected to a converter.
  • Converters 11F and 11J are both serial interface-type converters. However, one is connected to codec 16 and the other is connected to modem 22, and therefore each requires different operating instructions so as to properly interface with the connected device. Therefore, for each type of converter, there is a separate program for each type of device which may be connected to that converter.
  • a program which may include software, firmware, data, formats, conversion codes, etc., is downloaded from controller 10 to the selected converter 11 so as to properly configure the converter 11 for the type (serial, parallel, specialized) of converter that it is and also for the type of device with which it will be connected.
  • This provides flexibility in that if a new type of device is to be connected to the network then a program is written for that type of device and loaded into controller 10. Controller 10 then downloads the program to the converter 11 which is connected to that new type of device. Therefore, in general, a serial interface-type converter can be used with any serial interface device by simply downloading the appropriate serial interface program from controller 10 into that converter 11, and likewise for parallel interface-type devices.
  • serial-type or parallel-type converters can be easily supported by using the appropriate generic (serial-type or parallel-type) converters and then causing controller 10 to download the appropriate programs to each of the added converters. This reduces the inventory of different types of converters that the user must have on hand to repair or add to the system.
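  • The download scheme can be sketched as one program per (converter class, device type) pair; the program names and class interface below are placeholders.

```python
# Controller 10's program library: one entry per converter/device pairing.
PROGRAM_LIBRARY = {
    ("serial", "codec"):  "codec-interface-program",
    ("serial", "modem"):  "modem-interface-program",
    ("parallel", "vcr"):  "vcr-interface-program",
}

class NetworkConverter:
    def __init__(self, converter_class):
        self.converter_class = converter_class
        self.installed_program = None      # empty until configured

    def load(self, program):
        self.installed_program = program   # held in installed memory 45

def configure(converter, device_type):
    program = PROGRAM_LIBRARY[(converter.converter_class, device_type)]
    converter.load(program)

conv = NetworkConverter("serial")
configure(conv, "codec")       # converter attached to codec 16
configure(conv, "modem")       # the same hardware later reused for modem 22
print(conv.installed_program)  # -> "modem-interface-program"
```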
  • converter 11 is not programmed with the installed program at manufacture, although it could be so programmed if desired. Therefore, when a converter 11 is first installed in the videoconferencing system and power is applied, the converter 11 will not be configured. Furthermore, if the user changes the type of serial device connected to the converter 11, such as disconnecting converter 11 from codec 16 and connecting converter 11 to modem 22, then converter 11 will be improperly configured for the newly connected device. Therefore, the user will press set up button 46, which causes microprocessor 43 to cause display 47 to begin blinking. Also, microprocessor 43 will send its identification number and type to controller 10 along with a network standard signal which advises controller 10 that converter 11 needs to be configured.
  • The user will then go to controller 10 and, preferably using mouse 12, pull down an initial set-up menu (not shown).
  • the set up menu will list the last converter 11 which has reported a need to be configured.
  • the user will pull down another menu which lists the types of serial interface devices supported by the videoconferencing system.
  • Once the connected serial device is identified by the user, controller 10 will download, via network 23, the program necessary to allow converter 11 to interface between network 23 and the connected serial device.
  • Microprocessor 43 will install this program in the installed program memory 45.
  • Microprocessor 43 and memories 44 and 45 are shown as separate components for clarity but may be in a single device. If converter 11B has not been previously configured then a mouse, such as mouse 12, may be connected to a mouse control port on controller 10 in order to configure converter 11B.
  • FIG. 3 is a block diagram of a parallel-interface type network converter 11.
  • a parallel-type converter 11 is similar to that of a serial-type converter except that, instead of transceiver 50 and connector 51, converter 11 will have an output transceiver 54 and a parallel connector 57.
  • Output transceiver 54 comprises output drivers 55 and input buffers 56.
  • transceiver 54 provides isolation between microprocessor 43 and the parallel interface device.
  • device 54 is preferably configurable by microprocessor 43 to select which pins on connector 57 are output pins and which pins are input pins.
  • Devices which perform, or can be readily connected to perform, the functions of transceiver 54 are well known in the art. In the preferred embodiment, the functions of transceiver 54 are performed by the indicated Neuron microprocessor 43.
  • the operation of a parallel-type converter 11 is identical to that of a serial-type converter except that the inputs and outputs on connector 57 are configured for a device which is a parallel interface device, such as VCR 20.
  • FIG. 4 is a block diagram of a specialized-type network converter, such as converter 11I.
  • a specialized converter is useful in cases where the connected device does not have a serial or parallel interface or where that interface is already in use for some other purpose, but where there are also other means of controlling the device, such as by infrared signals or voltage level and/or polarity signals (analog signals).
  • Converter 11I, which interfaces with monitors 21, is an example of a specialized converter.
  • a specialized-type converter has a connector 40 for connection to the network 23, a power supply/filtering circuit 41, an RS-485 transceiver 42, a microprocessor 43, a basic program memory 44, an installed program memory 45, a set up button 46, and a display 47.
  • specialized converter 11 has a driver 61, which is capable of driving infrared (IR) LEDs 62A and 62B. For convenience, only two IR LEDs are shown, corresponding to two monitors 21, but more monitors 21 may be used. Each monitor 21 is, in the preferred embodiment, controllable by the use of infrared signals and has an infrared detector built into the monitor 21.
  • An IR LED, such as 62A, is driven by driver 61, which provides the signals to the LED 62A, which emits the infrared signals appropriate to cause monitor 21 to perform a particular action, such as turning on or off, turning the volume up or down if the speaker in monitor 21 is being used, adjusting brightness, contrast, etc.
  • a coil, such as coils 63A and 63B, is attached to each monitor 21.
  • a coil 63 is used to pick up the magnetic field of the horizontal deflection coils present in a monitor 21.
  • Coils 63A and 63B are connected to amplifier/detectors 64A and 64B, respectively.
  • An amplifier/detector 64 amplifies the signal provided by a coil 63 and detects (rectifies) the signal.
  • the output of each amplifier 64 is connected to buffer 65, which is connected to microprocessor 43. Buffer 65 provides any necessary buffering and voltage level shifting between the output of amplifier/detector 64 and microprocessor 43.
  • the on/off control signal in many monitors 21 is the same signal and the monitor 21 merely toggles between an on state and an off state.
  • a coil 63 is attached to the monitor 21 to pick up the radiation emitted by the horizontal deflection coil in that monitor 21. If the user sends an instruction to turn on a monitor 21 the microprocessor 43 will inspect the output of buffer 65 to determine if the coil 63 and amplifier/detector 64 associated with that particular monitor 21 are detecting radiation. If radiation is being detected then the monitor is already on and microprocessor 43 will not take any action. However, if monitor 21 is off then radiation will not be detected and, consequently, microprocessor 43 will cause driver 61 to pulse an LED 62 with the code required to toggle the on/off control of that monitor 21. Microprocessor 43 will then check the output from the coil 63 to determine if the operation was successful.
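  • The toggle-and-verify logic of the preceding paragraph, sketched with hypothetical stand-ins for driver 61 and the coil 63/buffer 65 feedback path:

```python
class Monitor:
    """Simulated monitor: the deflection coil radiates only when it is on."""
    def __init__(self, is_on=False):
        self.is_on = is_on
    def coil_detects_radiation(self):
        return self.is_on              # coil 63 pick-up, via buffer 65
    def pulse_ir_toggle_code(self):
        self.is_on = not self.is_on    # one IR code toggles on <-> off

def set_monitor_power(monitor, want_on: bool) -> bool:
    if monitor.coil_detects_radiation() == want_on:
        return True                    # already in the requested state
    monitor.pulse_ir_toggle_code()     # driver 61 pulses the IR LED
    return monitor.coil_detects_radiation() == want_on   # verify result

m = Monitor(is_on=False)
print(set_monitor_power(m, want_on=True))    # pulses IR, verifies -> True
print(set_monitor_power(m, want_on=True))    # already on, no pulse -> True
```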
  • coils 63 are a type 70F103AI, manufactured by J. W. Miller, Rancho Dominguez, California. The positioning of the coils 63 on the monitors 21 is not extremely critical, but it is preferred to place the coils 63 in a position to receive the maximum pickup when a monitor 21 is on so that the reliability of the on/off indication is consistently high.
  • the basic program memory 44 may contain the necessary IR transmit instructions, and so installed program memory 45, set-up button 46, and display 47 will not be needed. However, if converter 11 may be used with different types of monitors, then the necessary instructions for the several types of monitors may be included in basic program memory 44 or, alternatively, the type of monitor being used may be selected from a pull-down menu at controller 10 and the necessary IR transmit program downloaded from controller 10 into memory 45.
  • FIGS. 5A and 5B are a flow chart of the method used for positioning a camera.
  • the mouse 12 or the joystick 18 may be used to move a pointer within the display presented on a monitor, such as monitor 21A.
  • For convenience, only the operation using mouse 12 will be discussed, although it will be appreciated that joystick 18, with control buttons thereon, can be used to accomplish the same result.
  • This particular method of positioning the camera is referred to herein as "point-and-click". This phrase describes the action required by the user to reposition the camera. That is, using mouse 12, the user causes the pointer to be positioned (pointed) over the target of interest and then clicks a button on mouse 12.
  • Controller 10 then causes the selected camera to be aimed at the selected point so that the selected point is nominally in the center of the screen display seen by the user. This allows the user to quickly and easily designate where a selected camera should be pointing so that the user can conveniently view the desired object(s). It should be noted that this method is useful for both local cameras, that is, cameras which are at the same site as the user, and for remote cameras, that is, cameras which are at the remote site. Therefore, the user can easily adjust the remote camera to point at a desired object. This allows the user to focus a camera on a target of interest without having to instruct the person at the other end to stop whatever he or she is doing and position the camera as desired by the user.
  • a starting step 100 is shown but it will be appreciated that controller 10 performs many operations and therefore a starting step should be understood to be an entry point into a subroutine, such as a subroutine used for camera positioning.
  • in decision 101, a test is made as to whether any mouse button 12A, 12B is depressed. If so, then the user is indicating that some function other than point-and-click camera positioning is to be performed, and therefore other functions are tested and/or performed in step 102. If no mouse buttons are depressed then, in decision 103, a test is made for movement of the mouse. If there is no mouse movement then a return is made to decision 101.
  • decision 104 tests whether the pointer displayed on the screen of monitor 21A is outside the area of the monitor designated for the picture. That is, is the pointer now positioned over a control bar, selection icon, other function symbol, a different picture (picture-within-picture), or a different monitor? If the pointer is outside the picture area then the user is indicating that other functions are to be performed and controller 10 proceeds to step 102 to perform the other functions. If the pointer is within the picture area then decision 105 tests whether a mouse button, such as mouse button 12A, has been clicked. If not, then a return is made to decision 101. If so, then controller 10 determines in step 106 the amount of pan and tilt required to achieve the user's request.
  • Step 107 tests whether the amount of pan required is greater than the resolution error of the pan mechanism. That is, if the amount of pan required is one degree but the pan mechanism has a resolution error of two degrees, then panning should not be done. If panning is not to be done then decision 108 is executed. Decision 108 tests whether the tilt required is greater than the resolution error of the tilt mechanism. If the tilt required is not greater than the resolution error then a return is made to decision 101 because it has been determined that neither pan nor tilt is required. If, in decision 108, the tilt required is greater than the resolution error then step 112 is executed next.
  • in step 110, the pan rate is determined. Then, in decision 111, a test is made as to whether the tilt is greater than the resolution error. If not, then step 113 is executed next. However, if the tilt is greater than the resolution error, then the tilt rate is determined in step 112. Although this process causes the movement along both axes to be completed at the same time, an undesirable effect may occur when moving long distances, such as from one preset location to another when the field of view is narrow. Assume, for example, that the field of view is 6 degrees, and the pan angle will be 60 degrees.
  • the pan rate is selected to cause the object to move across the field of view (6 degrees) in time T, then it will take 10T seconds for the camera to reach its destination. However, if the pan rate is selected to cause the camera to traverse the full distance in T seconds, then the 6 degree field of view will cause objects to fly across the scene in a blur. Therefore, in the preferred embodiment, if the camera is to pan over a long distance the camera is zoomed out (and focused accordingly) so that the camera has a wide field of view. The high speed pan rate will then allow the movement from start to finish to occur in a timely manner but, because the camera is zoomed out, an object will be reduced in size and will move at an acceptable rate across the display screen. At the end of the pan operation the camera is zoomed in (and focused accordingly) as specified by the destination location.
  • controller 10 determines whether the pan distance is sufficiently large to require zooming out. If not then step 115 is executed. If so then the camera is zoomed out and then step 115 is executed. In step 115 pan, tilt, and/or zoom, as required, are begun.
  • Step 116 tests whether the pan/tilt operation has been completed. If not then a return is made to decision 116. If the operation is complete then the zoom and focus are restored in step 117, if necessary, and the process of camera movement is ended in step 118.
  • the rate of pan and tilt are determined by considering the desired number of seconds that it should take an object to move from one end of the field of view to the other end of the field of view. In the preferred embodiment, this setting is programmable at controller 10.
  • the display is considered to have a 2x3 aspect ratio (V to H).
  • the pan speed will be set to 15 degrees per second and the tilt speed will be set to 10 degrees per second.
  • the camera will reach the desired position, with respect to both axes, at approximately the same time. This has the desirable effect of making the camera positioning appear smooth. Otherwise, the camera may reach the desired position with respect to one axis first, for example the vertical axis, and then have to continue moving with respect to the other axis until the desired location is achieved, which makes the camera movement appear awkward.
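  • The stated 15 and 10 degrees-per-second figures can be reproduced by assuming, for illustration, a 45-degree horizontal field of view and a 3-second traverse time (neither value is given in the text).

```python
def pan_tilt_speeds(h_fov_deg, traverse_time_sec, aspect_v_to_h=2/3):
    """Rates at which the scene crosses the display in traverse_time_sec
    seconds; the 2:3 (V to H) aspect ratio keeps both axes in step."""
    pan_speed = h_fov_deg / traverse_time_sec
    tilt_speed = pan_speed * aspect_v_to_h   # vertical FOV is 2/3 as wide
    return pan_speed, tilt_speed

print(pan_tilt_speeds(45.0, 3.0))    # -> (15.0, 10.0) degrees per second
```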
  • the point-and-click method of camera control is a major improvement over existing button methods of camera control.
  • the present invention provides an alternative form of movement. If this alternative form is selected by the user, such as by using a pull-down menu or by pressing a different mouse button such as button 12B, the camera will dynamically follow the pointer. In this case, if the pointer is moved slowly toward the side of the display, controller 10 will cause the camera to slowly pan toward that side.
  • When the pointer is positioned all the way to the side of the display, or at some predetermined border point, controller 10 instructs the pan/tilt unit to move at its maximum speed. Controller 10 automatically zooms out the camera when panning at high speed and automatically zooms the camera back in to its original setting when the pointer is no longer at the side of the display and the pan speed has dropped to a slower rate. Of course, the user can adjust the zoom at any time.
  • Figures 6A and 6B are an illustration of the operation of the automatic zoom ("draw-and-release") feature of the present invention.
  • Figure 6A is an illustration of a monitor 21 having a screen 125, which is displaying a person 126 sitting at the end of a table 127.
  • the user wishes to focus on the person 126.
  • the user could adjust the pan and tilt controls and then adjust the zoom and focus controls so as to zoom in on person 126.
  • the user will simply use the mouse 12 to place the pointer at the desired pointer starting point (PSP), depress and hold a predetermined mouse button, such as the left button 12A, and drag the pointer across the area of interest, which causes a rectangular box to begin spreading across the screen, with one corner at the PSP.
  • When the pointer reaches the desired pointer ending point (PEP), the user will release the mouse button. The user has thereby drawn a rectangle around the area of interest and released the mouse button.
  • Controller 10 will then determine the appropriate pan and tilt for a camera and cause the camera to center its field of view on the center of the rectangle (CR), then cause the camera to zoom in so that rectangle 128 fills, as fully as possible, screen 125, and also cause the camera to refocus, if necessary.
  • the resultant display is seen in Figure 6B, which illustrates that the camera has been repositioned so that CR is now in the middle of the display (MD). Therefore, by the simple tasks of positioning the pointer in one corner of the desired scene, depressing a mouse button, dragging the mouse to draw a rectangle, and releasing the mouse button, the user has caused the selected picture area to be expanded to fill the display 125.
  • the use of point, click, drag, and release techniques to draw a box, such as box 128, are, in general, well known in the personal computer field.
  • Figure 7 is a flow chart of the method for controlling the aim point and the zoom operation of the camera.
  • controller 10 tests, at decision 131, whether the appropriate mouse button has been depressed. If not then, in step 132, controller 10 tests for and/or performs other functions. If the mouse button has been depressed then, in step 133, controller 10 records the initial pointer position PSP. Then, in decision 134, controller 10 tests whether the mouse button has been released. If the mouse button has not been released then the user has not completed drawing the desired rectangle 128. Once the mouse button is released, the user has completed drawing rectangle 128 and has therefore designated the area of interest. Controller 10 therefore proceeds to step 135 and performs the following operations. First, the final pointer position PEP is recorded.
  • Controller 10 calculates the difference between the midpoint CR of rectangle 128 and the midpoint MD of display 125. These steps determine the pan and tilt required to center the desired picture on screen 125 and, although performed automatically, are analogous to the user moving the pointer to position CR and then clicking on the mouse, as in the procedure described with respect to Figure 5. Controller 10 then performs steps 106 through 117 of Figure 5 except that the "No" output of decision 108 does not return to step 101 but moves to substep 5 of step 135.
  • controller 10 has caused the camera to pan and tilt so as to place the center CR of rectangle 128 at the midpoint MD of display 125.
  • controller 10 must still determine how much zoom is required to satisfy the request of the user. Therefore, controller 10 determines the X-axis movement XM of the pointer and the Y-axis movement YM of the pointer. Controller 10 then adds the X-axis movement and the Y-axis movement to obtain the total movement of the pointer. Controller 10 then determines the ratio of the total movement (XM + YM) to the total size (XD + YD) of the screen 125 of monitor 21.
  • Controller 10 determines a new field of view by multiplying the above ratio by the current field of view. It will be appreciated that the current field of view is information which may be obtained from the zoom mechanism on the camera. Controller 10 then causes the camera to zoom to the new field of view or, if the new field of view is less than the minimum field of view supported by that camera, to zoom to the minimum field of view supported. Controller 10 then instructs the camera to focus, either by an auto focus process or by a memory process such as described below, and then the procedure ends.
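The arithmetic just described reduces to a few lines. The following Python fragment is a minimal sketch, not the patented implementation: the function name and parameters are illustrative, the screen dimensions stand in for XD and YD, and the current field of view is assumed to be readable from the camera's zoom mechanism, as the text states.

```python
def rectangle_zoom(psp, pep, screen_w, screen_h, current_fov, min_fov):
    """Compute pan/tilt offset and new field of view from a dragged rectangle.

    psp, pep     -- (x, y) pointer starting and ending points, in pixels
    screen_w/h   -- XD and YD, the dimensions of screen 125
    current_fov  -- current field of view angle reported by the zoom mechanism
    min_fov      -- smallest field of view the camera's zoom supports
    """
    # Midpoint CR of the drawn rectangle 128.
    cr_x = (psp[0] + pep[0]) / 2
    cr_y = (psp[1] + pep[1]) / 2

    # Offset of CR from the midpoint MD of the display; the controller
    # translates this offset into pan and tilt commands.
    offset_x = cr_x - screen_w / 2
    offset_y = cr_y - screen_h / 2

    # Pointer movement along each axis (XM and YM).
    xm = abs(pep[0] - psp[0])
    ym = abs(pep[1] - psp[1])

    # Ratio of total pointer movement to total screen size, per the text.
    ratio = (xm + ym) / (screen_w + screen_h)

    # New field of view, clamped to the camera's minimum.
    new_fov = max(ratio * current_fov, min_fov)
    return (offset_x, offset_y), new_fov
```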
  • the rectangle 128 illustrated in connection with Figure 6A has XM and YM proportions such that zooming in will cause rectangle 128 to nicely fill screen 125.
  • the user may not always draw such a well proportioned rectangle.
  • the user may draw a rectangle which is very wide and has minimal height or is very tall but has minimal width.
  • an alternative process must be followed.
  • One possible alternative approach is to expand rectangle 128 so that the larger of XM and YM is used to determine the zoom required.
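As a sketch of that alternative, the zoom ratio could be taken from the more demanding axis alone, so that even a very wide or very tall rectangle remains entirely visible after the zoom. This is one possible reading of the approach, with hypothetical names:

```python
def rectangle_zoom_aspect_safe(xm, ym, screen_w, screen_h, current_fov, min_fov):
    # Use the larger of the two normalized extents so that a very wide or
    # very tall rectangle 128 still fits entirely on screen 125 after zooming.
    ratio = max(xm / screen_w, ym / screen_h)
    return max(ratio * current_fov, min_fov)
```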
  • Figure 8 is a schematic block diagram of a video unit control node 15.
  • video unit control node 15 is connected to three cameras 150A-150C, three monitors 21A-21C, and a VCR 20. It should be understood that the number of cameras, monitors and VCRs is a design choice and is limited only by the video switching capability of node 15, which is primarily determined by cost considerations.
  • Video unit control node 15 selectively routes video signals from cameras 150, VCR 20, codec 16 and the auxiliary input, to monitors 21, codec 16, VCR 20 and the auxiliary output.
  • codec 16 has a motion input and a motion output, for scenes which frequently change, and a graphics input and a graphics output for scenes which infrequently change, such as slides and graphs.
  • Video unit control node 15 comprises a plurality of video input buffers designated generally as 151, which are connected to the inputs of an 8x8 video switch matrix 152, which is connected to a plurality of output buffers designated generally as 153, a control logic 154, a video overlay device 155, a sync generator input lock signal buffer 160, a plurality of sync separators 161A-161C, a sync generator and phase locked loop (PLL) circuit 162, and a black burst output distribution amplifier 164.
  • Buffers 151, which also perform DC restoration on the input signal, and buffers 153 buffer the incoming and outgoing video signals in a conventional manner.
  • switch matrix 152 switches the input signals from cameras 150, VCR 20, codec 16, the video overlay circuit 155, and the auxiliary input to the desired destination device, such as monitors 21, codec 16, VCR 20, and the video overlay circuit 155.
  • Control logic 154 is connected between converter 11E and switch matrix 152.
  • converter 11E extracts signals from network 23 which are intended for video control node 15 and converts the signals into the proper format for control node 15.
  • Control logic 154 accepts the signals from converter 11E and sends corresponding control signals to switch matrix 152, sync generator and PLL circuit 162, and video overlay circuit 155.
  • Sync generator input lock signal buffer 160 has an input connected to a Genlock input signal, and an output connected to a sync separator 161A.
  • Sync separator 161A, in a well known manner, recovers and separates the vertical synchronization signals from the horizontal synchronization signals.
  • the output of buffer 160 and the output of sync separator 161A are connected to inputs of sync generator and PLL circuit 162.
  • Circuit 162 provides a black burst output which is synchronized to the selected input signal. For NTSC signals the output of buffer 160 is used as the sync source; for PAL signals the output of sync separator 161A is used as the sync source.
  • Control logic 154 directs circuit 162 as to which input signal should be used for synchronization.
  • the outputs of buffers 151C and 151D are connected to the inputs of sync separator circuits 161B and 161C, respectively.
  • the outputs of circuits 161B and 161C are connected back to inputs of buffers 151C and 151D, respectively, so that DC restoration is performed based upon the actual input signal.
  • the outputs of buffers 151 A, 151B, and 151E-151H could be provided to sync separator circuits, and the outputs of the sync separation circuits routed back to their respective buffers.
  • control logic 154 provides a sync signal to these buffers for DC restoration.
  • the sync signal provided by control logic 154 is preferably the sync signal provided by sync generator and PLL circuit 162.
  • Buffers 151A, 151B, and 151E-151H are preferably used as inputs from devices, such as cameras, which can be synchronized to an external source.
  • Buffers 151C and 151D are preferably used as inputs from devices, such as VCRs, which typically cannot be synchronized to an external source. Therefore, for devices which can be synchronized, DC restoration is performed based upon a master (Genlock) sync signal and, for devices which cannot be synchronized, DC restoration is performed based upon the sync signal from that device.
  • sync generator and PLL circuit 162 is connected to an input of control logic 154. This allows control logic 154 to determine the start of a video frame or the start of a line so that video switching occurs at the proper place in a picture. Also, some codecs require information as to the vertical interval within which switching is to occur and control logic 154 uses the signal from sync circuit 162 to provide this information as well.
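The timing constraint can be sketched in a few lines; `wait_for_vertical_sync` and `set_crosspoint` are hypothetical stand-ins for what is, in the node itself, hardware behavior of control logic 154:

```python
def switch_at_vertical_interval(wait_for_vertical_sync, set_crosspoint, source, dest):
    # Delay the crosspoint change until the vertical interval reported by
    # sync generator and PLL circuit 162, so the cut never lands mid-picture.
    wait_for_vertical_sync()
    set_crosspoint(source, dest)
```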
  • the output of circuit 162 is connected to the input of a distribution amplifier 164 which provides several outputs G1-G4, which are black burst generator lock outputs. These outputs are used to synchronize cameras 150 so that the pictures from all cameras 150 are in sync.
  • Video overlay circuit 155 is used to provide special video effects such as picture within picture, and superimposed graphics and icons.
  • Video overlay circuit 155 may be part of control node 15, part of controller 10, or an independent device.
  • the auxiliary input is used to provide graphical user interface (GUI) information such as video icons, control "buttons" on the monitor display, control borders and pointers, etc.
  • this information is generated by controller 10.
  • Methods of generating GUI information are well known to those of ordinary skill in the art.
  • Figure 9 is a schematic block diagram of an audio unit control node 14.
  • Control node 14 selectively routes audio signals from various sources to various destinations.
  • audio inputs are from an auxiliary input, left and right channel inputs from VCR 20, microphones 174A-174D, a telephone connection, and the audio output of codec 16.
  • Destinations for audio signals are, again by way of example, the record input of VCR 20, a telephone connection, and the audio input of codec 16. Any input audio signal may be routed to any desired destination and, likewise, any destination may receive any selected audio input signal.
  • All input and all output signals are buffered, either by a plurality of buffers/amplifiers designated generally as 173 or a mixing circuit 172.
  • the auxiliary input, the TELCO input, and the inputs from microphones 174A-174D are buffered by buffers/amplifiers 173A-173C, respectively.
  • the input from codec 16 is buffered by buffer/amplifier 173E.
  • the inputs from VCR 20 are buffered by mixer 172A.
  • the auxiliary input, the VCR 20 inputs, the TELCO input, the microphones 174A-174D inputs, and the codec 16 audio output are each passed through a muting circuit 170A-170E, respectively, and also through a gain control circuit 171A-171H, respectively.
  • the auxiliary input, VCR input, and TELCO input are then provided to a plurality of mixers designated generally as 172C.
  • Mixers 172C contain separate mixers for the output to VCR 20, the output to the TELCO, and the output to the audio input of codec 16.
  • the inputs from microphones 174 are routed to a digital signal processing echo canceller 176.
  • the output of echo canceller 176 is then routed to the mixers 172C.
  • the outputs of three of the mixers of 172C are routed through gain control circuits 171I-171K and buffers/amplifiers 173E before being provided to VCR 20, the TELCO connection, and the audio input of codec 16.
  • the audio output from codec 16 is routed through a gain control circuit 171H, a mute control circuit 170E, and then to the mixers 172C.
  • the output of the fourth mixer of mixers 172C is routed to the received input of echo canceller 176.
  • the received output of echo canceller 176 is routed through mute circuit 170F, gain control circuit 171L, and amplifier 173D, before being routed to speaker 175.
  • a mute circuit 170 comprises, as shown by mute circuit 170A, an analog switch.
  • the mute circuits 170 are controlled by control logic 177.
  • gain control circuits 171, such as gain control 171A, are digitally controlled gain circuits, and are controlled by control logic 177.
  • the user can use mouse 12 to pull down a menu and select a particular input or output device, and then select the gain or muting desired for that particular device.
  • the signals from mouse 12 are provided by converter 11B to controller 10.
  • Controller 10 interprets the mouse signals to determine the action requested by the user and, in this case, sends appropriate gain and mute signals to converter 11D.
  • Converter 11D extracts this information from network 11 and sends the appropriate control signals to control logic 177 which, in turn, supplies the appropriate signals to the gain circuits 171 and the mute circuits 170.
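The control path just described (mouse 12 to controller 10 to converter 11D to control logic 177) can be sketched as follows. This is a hypothetical illustration: the patent does not specify a message format for the network standard control signals, so the `AudioCommand` fields, the channel names, and the `network.send` helper are all assumptions.

```python
from dataclasses import dataclass

@dataclass
class AudioCommand:
    channel: str     # e.g. "MIC_1", "VCR_LEFT", "TELCO" (illustrative names)
    gain_db: float   # target setting for a gain control circuit 171
    muted: bool      # target setting for a mute circuit 170

def on_menu_selection(network, channel, gain_db, muted):
    # Controller 10 interprets the mouse signals, builds a network standard
    # control signal, and addresses it to converter 11D.
    network.send(address="11D", payload=AudioCommand(channel, gain_db, muted))
    # Converter 11D would then translate the payload into the device-level
    # signals that control logic 177 applies to circuits 170 and 171.
```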
  • echo canceller 176 is an echo cancellation card manufactured by Gentner Communications Corporation, Salt Lake City, Utah. Echoes are typically caused by feedback between a speaker 175 and microphones 174 in a room, and are made more noticeable and distracting by the time delay caused by codec 16 and the additional delay which occurs when the signal is transmitted via satellite.
Camera Focusing
  • the present invention allows the selection of the camera focus to be controlled by the position of the camera.
  • This feature establishes a database of the room layout and, when the user clicks and/or zooms in on a region, the database is consulted to determine the focus settings and the database focus setting is automatically applied to the camera. If the selected object is slightly out of focus the user will then adjust the focus setting manually. When the user manually adjusts the focus setting, the region of the object and/or the appropriate focus setting are added to the database.
  • the pan position, tilt position, and field of view angle may vary slightly from time to time, even though the user is designating the same object.
  • the present invention uses regions, rather than pixels, to determine if the user has selected the same target.
  • the database therefore consists of a tree of regions.
  • a region is defined as a viewing area seen by a camera and is identified by a polar coordinate system which specifies a pan position, a tilt position, and a camera field of view angle.
  • Figures 10A-10C are illustrations of the relationship between regions. Two regions are considered to match, or be the same region, if the intersection of the regions contains a certain percentage of each region, as shown in Fig. 10A. In the preferred embodiment, this percentage is programmable and the default setting is 80%.
  • a parent region is a region which completely encompasses another region, as shown in Fig. 10B.
  • a parent region may be encompassed within another, larger region, and therefore one parent region may be the child of another parent region, as shown in Fig. 10C.
  • there is a master parent region which is a parent to all regions, and whose focus setting serves as the default. There is no fixed limit on the number of regions that may be stored in the database.
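These relationships can be sketched in code. The fragment below is illustrative only: it approximates each region's viewing area as a square patch of angular extent equal to its field of view, which the patent does not specify; only the polar coordinates and the programmable 80% overlap test come from the text.

```python
from dataclasses import dataclass

@dataclass
class Region:
    pan: float   # pan position, degrees
    tilt: float  # tilt position, degrees
    fov: float   # field of view angle, degrees

    def extent(self):
        # Approximate the viewing area as a square of side `fov` centered
        # on (pan, tilt) -- an illustrative simplification.
        h = self.fov / 2
        return (self.pan - h, self.tilt - h, self.pan + h, self.tilt + h)

def overlap_area(a, b):
    ax0, ay0, ax1, ay1 = a.extent()
    bx0, by0, bx1, by1 = b.extent()
    w = min(ax1, bx1) - max(ax0, bx0)
    h = min(ay1, by1) - max(ay0, by0)
    return max(w, 0.0) * max(h, 0.0)

def same_region(a, b, threshold=0.80):
    # Two regions match if their intersection contains at least the
    # programmable percentage (default 80%) of *each* region.
    inter = overlap_area(a, b)
    return inter >= threshold * a.fov ** 2 and inter >= threshold * b.fov ** 2

def is_parent(parent, child):
    # A parent region completely encompasses the child region
    # (a small factor allows for numerical slack).
    return overlap_area(parent, child) >= 0.999 * child.fov ** 2
```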
  • Figures 11A and 11B are a flow chart of the camera focusing process of the present invention.
  • the flow chart of Figure 11A is entered whenever there is a change in the pan, tilt, zoom or focus settings of the camera.
  • controller 10 determines the polar region based upon the pan position, the tilt position, and the field of view angle (zoom setting).
  • if the polar region is in the database then, in step 203, the focus setting is obtained from the matching polar region in the database and then step 205 is executed. If the polar region is not in the database then, in step 204, the focus setting is obtained for a parent region in the database and then step 205 is executed. It will be appreciated at this point that if there is a matching polar region then the focus setting will be extremely close to the desired focus setting. If there is not a matching polar region then, by the use of parent regions, a focus setting is obtained which may be adequate or which will allow the user to easily fine tune the focus setting.
  • controller 10 sends signals to converter 11G and control node 17 to adjust the focus of the selected camera.
  • also, the start time for that focus setting is recorded. This start time is used in step 215 below.
  • Decision 206 determines whether a new region has been selected, such as by point and click, draw-and-release, manual controls, etc. If so then a return is made to step 201. If not then decision 207 tests whether the user has adjusted the focus since it was set in step 205. If not then a return is made to decision 206. If the user has adjusted the focus then, in step 210, controller 10 sends signals which cause the focus to be adjusted according to the user's instructions and records the focus setting start time. In decision 211 controller 10 determines whether the current polar region is in the database.
  • if so then controller 10 adjusts the focus setting in the database to correspond to the focus setting actually selected by the user and then returns to decision 206.
  • the focus for a particular polar region is made to conform to the user's particular desires.
  • if the current polar region is not in the database then decision 213 tests whether the database is full. If not then, in step 214, controller 10 adds the new polar region and the focus settings to the database and returns to decision 206. However, if the database is full then, in step 215, controller 10 searches the database for the least important region and discards that region. In the preferred embodiment, the least recently used region is deemed to be the least important region and is discarded.
  • controller 10 then adds the new region and focus setting to the database in step 214. Therefore, by the above process, the camera is automatically focused on the target selected by the user and, if the selected focus setting is unsatisfactory and the user adjusts it, then the user's selected focus setting is stored and used the next time that the user selects that region.
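A minimal sketch of that behavior, reusing the `same_region` and `is_parent` helpers from the region sketch above, might look like the following. The class name, the capacity parameter, and the choice of the smallest enclosing parent in the fallback are assumptions for illustration.

```python
import time

class FocusDatabase:
    """Illustrative sketch of the region/focus store of Figures 11A-11B."""

    def __init__(self, capacity, master_focus):
        self.capacity = capacity
        self.master_focus = master_focus  # default from the master parent region
        self.entries = []  # list of [region, focus_setting, last_used_time]

    def lookup(self, region):
        # Step 203: a matching region yields its stored focus setting.
        for entry in self.entries:
            if same_region(entry[0], region):
                entry[2] = time.time()
                return entry[1]
        # Step 204: otherwise fall back to a stored parent region
        # (here, the smallest enclosing parent), else the master default.
        parents = [e for e in self.entries if is_parent(e[0], region)]
        if parents:
            return min(parents, key=lambda e: e[0].fov)[1]
        return self.master_focus

    def record_manual_focus(self, region, focus_setting):
        # Steps 211-214: update a matching region, or add a new one,
        # discarding the least recently used region if the database is full.
        for entry in self.entries:
            if same_region(entry[0], region):
                entry[1] = focus_setting
                entry[2] = time.time()
                return
        if len(self.entries) >= self.capacity:
            self.entries.remove(min(self.entries, key=lambda e: e[2]))
        self.entries.append([region, focus_setting, time.time()])
```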
Camera Construction
  • Figure 12A is an illustration of the preferred embodiment of a camera 150 of the present invention.
  • Camera 150 has a camera body 235, a focusing lens system 230, a zoom/field of view lens system 231, a panning system 232, a tilt system 233, and a camera base 234.
  • the design of focusing systems, zoom systems, panning systems, tilt systems, and cameras themselves is well known in the art.
  • the systems provide feedback to controller 10 so that controller 10 can evaluate the response of the system to the instruction sent.
  • Figure 12B is an illustration of the feedback system associated with the camera controls: systems 230-233, and control node 17.
  • a feedback unit, which is part of systems 230-233, comprises a drive motor 240, a drive shaft 241, a position sensing means 242, and a drive train 243.
  • Position sensing means 242 may be a variable resistor, a potentiometer, a digital shaft position encoder, etc.
  • Drive train 243 drives the appropriate focusing, zooming, panning or tilting function.
  • Systems 230-233 are connected to camera unit control node 17.
  • Control node 17 comprises control logic 250, a motor power supply 251, and a position-to-digital converter 252. Assume that the user indicates that a selected camera should pan to the left.
  • Controller 10 will send the appropriate instructions to converter 11G which, in turn, will transfer the instructions to control logic 250 of control node 17.
  • Control logic 250 will, in turn, cause motor power supply 251 to apply the appropriate voltage to motor 240 to cause motor 240 to turn in the direction which, via drive shaft 241 and drive train 243, causes camera 150 to pan to the left.
  • the position-to-digital converter 252 converts the change in resistance to digital signals and provides these digital signals to control logic 250.
  • control logic 250 may close the loop and control motor power supply 251 so as to achieve the pan position specified by controller 10.
  • alternatively, control logic 250 sends the current pan position back to controller 10 and controller 10 determines whether the camera 150 has reached the desired position.
  • control of motor 240 may be effected by the voltage, the pulse width, and/or the polarity of the voltage provided by motor power supply 251, which is controlled by control logic 250.
  • Position-to-digital converter 252 may directly measure the resistance of a potentiometer in position sensing means 242, may apply a voltage across a potentiometer in position sensing means 242 and measure the output voltage from the potentiometer, or use other means, such as digital shaft position encoding techniques.
  • the means of sensing the position is not critical but should be accurate enough to provide the degree of control necessary to satisfy the user. In the preferred embodiment, a pan position resolution of 0.1 degrees, a tilt position resolution of 0.1 degrees, and a field of view resolution of 0.03 degrees are used.
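A sketch of the closed loop of Figure 12B follows. The `read_position` and `drive` callables are hypothetical stand-ins for position-to-digital converter 252 and motor power supply 251; a real implementation would shape speed via the voltage, pulse width, and/or polarity described above.

```python
def seek_position(target, read_position, drive, resolution=0.1):
    """Sketch of the closed loop of Figure 12B.

    target        -- desired pan (or tilt) position, in degrees
    read_position -- stand-in for position-to-digital converter 252
    drive         -- stand-in for motor power supply 251; sign selects direction
    resolution    -- positional resolution (0.1 degrees for pan and tilt)
    """
    while True:
        error = target - read_position()
        if abs(error) <= resolution:
            drive(0.0)  # stop the motor: position reached
            return
        # Drive toward the target; control logic 250 could instead vary
        # voltage, pulse width, and/or polarity to set speed and direction.
        drive(1.0 if error > 0 else -1.0)
```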
  • the position sensing mechanism 242 may be a factory installed part of a system 230-233 or may be a retrofit.
  • a camera 150 is a Hitachi CCD color camera model KB-C550, manufactured by Hitachi Denshi America, Woodbury, New York, and the lens is a Rainbow Automatic Iris electrically driven zoom lens model H6X8MEA-II, manufactured by International Space Optics, Huntington Beach, California.
  • Figure 1 illustrates only a single camera unit control node 17. However, in the preferred embodiment, there is a separate camera unit control node 17 and a separate converter 11G associated with each camera so that a camera 150 may be attached or removed from the system by connecting and disconnecting a minimum number of wires and cables.
  • Figure 12B illustrates a separate motor power supply 251, position-to-digital converter 252, and control logic 250 for each system 230-233.
  • however, the present invention is not so limited. If the motors 240 for the different systems 230-233 are of a similar type then a single motor power supply 251 may be used to control all the motors. Further, the changing of a setting, such as pan, tilt, focus and zoom, occurs at a relatively slow rate compared with other system operations. Therefore, it is possible to multiplex the outputs of several position sensing means 242 into a single position-to-digital converter 252, thereby reducing costs.
  • Control logic 250 selects the appropriate position sensing means 242 in accordance with the motor 240 of the system 230 that is being driven and needs to be monitored. In this manner, a single control logic circuit 250, motor power supply 251, and position-to-digital converter 252, combined with a multiplexer (not shown), may be employed to service two or more systems 230-233.
  • Figure 13 is an illustration of a two-monitor videoconferencing system of the present invention.
  • the system has monitor 21A, which depicts the scene seen by the local camera, and monitor 21B, which depicts the scene seen by the remote camera.
  • the local camera is showing a desk 300 with two persons 301 and 302, one of whom is typically the user.
  • Monitor 21B shows the remote scene, which has a person 304 sitting at a desk or table 303.
  • Monitor 21A also shows a control bar 270. It will be noted that person 304 is not centered in the display on monitor 21B but that the user wishes person 304 to be centered.
  • the user will use mouse 12 to move cursor 280 to control bar 270 and pull down a camera selection menu 271.
  • in one embodiment, the menu will pull down when the user simply moves the cursor over the appropriate position on the control bar and, in another embodiment, the menu will be pulled down if the user positions the pointer over the appropriate position on the control bar and depresses or clicks a button 12A, 12B on mouse 12.
  • Methods for pulling down menus are well known in the personal computer field.
  • Camera menu 271 lists the available cameras such as a local camera, a remote camera, and a graphics camera. In this case the user wishes to select the remote camera so the user will click on the appropriate spot 272 of menu 271 to select the remote camera. This will cause a second menu 273 to pull down listing the functions that can be performed with that camera, such as pan left/right, tilt up/down, zoom in/out, and focus.
  • the user wishes to move person 304 to the center of monitor 21B and decides to first pan the camera to center person 304.
  • the user will therefore select the panning function 274.
  • This will cause a pan control icon 275 to appear on monitor 21B.
  • Icon 275 shows arrows to allow the user to specify movement of the camera to the right 276 or to the left 277.
  • the user will therefore position pointer 280 over the appropriate arrow and click and hold a mouse button 12A or 12B until the desired position of person 304 has been achieved.
  • the user can go back to menu 273 to select tilt and adjust the tilt position as desired, as well as the zoom and focus.
  • the user could simply use the point-and-click technique described above.
  • while control bar 270 and menus 271 and 273 are shown on monitor 21A and the icon 275 is shown on monitor 21B, it will be appreciated that this is merely a design choice and that the control bar, menus, and icons may be displayed on either monitor and, if desired, can be moved, using control bar 270, from one monitor to the other.
  • Mouse 12 is preferably used to move pointer 280 between the displays of monitors 21A and 21B. The movement of a cursor or pointer between screens is well known in the personal computer field.
  • controller 10 also supports operation with picture-within-picture, split-screen, and four-quadrant picture operation. In these cases controller 10 controls, and therefore knows, the switching point between one picture and the next and therefore is able to determine whether the pointer is over a scene controlled by a first camera, a second camera, or even a remote camera.
  • Monitor 21B illustrates a picture 281 within the broader picture illustrated. In this illustration, picture 281 is a view of a graph 282. The user could therefore position cursor 280 over picture 281 and controller 10 would know that the subsequent user actions were directed to picture 281 and not directed to the larger picture depicting person 304. If the picture 281 were being generated by a remote camera then controller 10 would send network standard signals corresponding to the desired action to the remote controller, which would cause the remote camera to take the desired action.
  • the source of the picture 281 may be any camera which is selectable.
  • the video unit control node 15 is programmed by controller 10 to dynamically connect the appropriate video signals between the appropriate devices so that picture-within-picture and other types of pictures may be obtained. Methods for achieving various multiple picture presentations are well known in the television broadcasting field.
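Because controller 10 programs the switching points, the pointer-to-camera hit test reduces to checking which picture rectangle contains the pointer. The sketch below is hypothetical; the layout representation and camera names are illustrative.

```python
def camera_under_pointer(pointer, layout):
    """Sketch of the hit test described above.

    pointer -- (x, y) pointer position on the monitor
    layout  -- list of (x0, y0, x1, y1, camera_id) rectangles, ordered with
               inset pictures (e.g. picture 281) before the full-screen scene,
               mirroring what controller 10 programmed into control node 15
    """
    for x0, y0, x1, y1, camera_id in layout:
        if x0 <= pointer[0] < x1 and y0 <= pointer[1] < y1:
            return camera_id
    return None

# Example: a picture-within-picture layout where inset picture 281 overlays
# the remote scene; the inset is listed first so it wins the hit test.
layout = [(480, 320, 640, 440, "graphics camera"),   # picture 281
          (0, 0, 640, 480, "remote camera")]         # full display
assert camera_under_pointer((500, 350), layout) == "graphics camera"
assert camera_under_pointer((100, 100), layout) == "remote camera"
```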

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Closed-Circuit Television Systems (AREA)
  • Two-Way Televisions, Distribution Of Moving Picture Or The Like (AREA)
  • Studio Devices (AREA)

Abstract

A videoconferencing system which readily accommodates new devices and new features. A plurality of devices (10, 12-18, 20-22) are connected to a network (23) by a plurality of network converters (11). A controller (10) contains software pertinent to each type of converter and the device to which it is connected. The controller (10) loads the appropriate software into each converter (11) so that each converter is then configured for the device (10, 12-18, 20-22) to which it is connected. User instructions are provided by a mouse (12), a control panel (13), or a joystick (18) to the controller (10). The controller (10) interprets the instructions to determine the action required and sends an appropriate command to the selected device. The present invention also provides a method for automatically positioning the camera, automatically zooming and focusing the camera, and for automatically adjusting the pan and tilt rates, and the zoom and focus during pan and tilt operations.

Description

ADAPTIVE VIDEOCONFERENCING SYSTEM
Technical Field
The present invention relates to videoconferencing systems and more particularly to a videoconferencing system which can accommodate a plurality of different devices and which provides for ease of operation by the user.
Background of the Invention
Typical prior art videoconferencing systems fall into one of two categories: those where the intelligence is centralized in the coder-decoder (codec) or a system control unit; and those where the intelligence is distributed so that each peripheral device controller has the intelligence necessary to directly control other peripheral devices in the system. One shortcoming of centralized intelligence systems is that such systems are not readily adaptable to accommodate new devices and new versions of existing devices. The addition of another peripheral device beyond the number originally planned for, or the addition of a new type of peripheral device, can require a substantial investment in time and money to accommodate the desired additional device or new device. Furthermore, most centralized intelligence systems have a limited capacity with respect to the number of ports available to connect to peripheral devices. Once this capacity has been reached, new devices can be added only by removing existing devices, such as lesser used devices, or by obtaining another codec or system controller which can accommodate the increased number of devices. Distributed intelligence systems, such as that shown in U.S. Patent No.
5,218,627 to Corey, have the shortcoming in that each peripheral device controller must have the intelligence necessary to control every type of peripheral device connected to the network, and every additional peripheral device must have a peripheral device controller which has the intelligence necessary to control all the existing devices on the network. Therefore, the addition of a new type of peripheral device requires new programming to be provided for each of the existing peripheral device controllers, and requires programming of the controller for the new type of device to accommodate the existing peripheral devices.
Therefore, there is a need for a videoconferencing system which can readily accommodate both additional peripheral devices and new types of peripheral devices.
Positioning of video cameras is required for videoconferencing as well as for a number of other activities, such as surveillance. The terms pan, tilt, zoom and focus are industry standards which define the four major axes for which a camera may be adjusted. Traditional camera positioning provides for manual adjustment of these axes, as well as buttons which provide for automatically positioning the camera to a preset location. A preset function recalls the pan, tilt, zoom and focus settings that have been previously ascertained and stored for that preset location.
Traditional videoconferencing systems provide for rather rudimentary control of these camera functions. That is, the user has a control panel for manually controlling camera functions, such as buttons for up/down, left/right, zoom in/out, and focus. The user can also typically select one of several preset camera settings so that, by the press of a single button, the camera will automatically position and focus itself at some preselected target. Of course, the preset function requires planning because the camera must be manually adjusted for the preset, and then the settings stored. The preset button then merely recalls these settings and adjusts the camera accordingly. If a location has not been preset then the user must manually adjust the pan, tilt, zoom, and focus settings for that location. However, these controls are not intuitively obvious or easy to use, partly because the user may think that the camera should pan in one direction to center an object whereas, because of the position of the camera with respect to the user and the object, which object may be the user, the camera should actually move in the opposite direction. For example, the user typically sits at a table and faces the camera, and beside the camera is a monitor screen which allows the user to see the picture that the camera is capturing. If the user is centered in the picture, and wishes the camera to center on his right shoulder, the user may think that he wants the camera to pan left because, on the screen as seen by the user, the user's right shoulder is to the left of the user's center. However, the camera should actually pan to the right because, from the camera's viewpoint, the user's right shoulder is to the right of the user's center.
Also, current manual camera positioning techniques typically use a fixed motor speed. This results in the panning being too rapid and the scene flying by when the camera is zoomed in on an object, or in the panning being too slow and the scene taking a prolonged time to change to the desired location when the camera is in a wide field of view setting (zoomed out).
Furthermore, in traditional videoconferencing systems, when the camera is moving to a preset location the pan and tilt systems move at the same rate. If the required pan movement is different than the required tilt movement then the camera will have completed its movement along one axis before it has completed its movement along the other axis. This makes the camera movement appear to be jerky and unnatural.
After the user has completed the process of changing the camera position the user may have to refocus the camera. As chance would have it, the first attempt to refocus the camera usually is in the wrong direction. That is, the user inadvertently defocuses the camera. The learning process is short, but the need to focus creates delays and frustration.
When the system has multiple cameras which are subject to control by the user, typical systems require the user to use buttons on the control keyboard to manually select the camera to be controlled, and/or assign separate keys to separate cameras. Frequently, the user will select the wrong camera, or adjust the wrong camera.
Summary of the Invention
The present invention provides a video teleconferencing system which combines a central intelligence with distributed intelligence to provide a versatile, adaptable system. The system comprises a controller and a plurality of network converters. Each network converter is connected to a system network as well as to one or more peripheral devices. The controller contains the software necessary for its own operation as well as the operation of each of the network converters. The user selects the type of device that is connected to a network converter and the controller sends the software appropriate to that type of device to the network converter. The network converter loads the software into its own memory and is thereby configured for operation with that type of device. This allows a network converter to be quickly programmed for a particular peripheral device. This also allows for quick and convenient upgrading of the system to accommodate new devices. Rather than having to design a new network converter for each type of new peripheral device, software for that new device is written and stored in the controller. The software can then be loaded into a network converter when that new device is added to the system. Therefore, existing network converters can be used to accommodate new devices. This reduces the number and type of network converters that must be maintained in inventory and also minimizes the obsolescence of network converters as new devices and new versions of existing devices become available. In addition, the present invention provides that the controller will perform conversion of instructions from the initiating device, such as a mouse, to the controlled device, such as a camera. This allows for easy and convenient upgrading of the system to accommodate new devices because the peripheral devices do not need to understand the signals from other peripheral devices. The controller performs the necessary device-to-device signal translation. For example, one network converter will convert signals from a mouse into network standard control signals which represent the mouse movement, such as left, right, up, down, button 1 depressed, button 1 released, etc., regardless of the type of mouse being used. The controller then inspects these network standard control signals to determine the type of action requested by the user. The controller then generates network standard control signals corresponding to the desired action and places these signals onto the network. Examples of network standard control signals intended for the control of a camera might be pan left, pan right, etc. The camera network converter then performs a conversion of the network standard signals from the controller into the type of control signals required for that particular camera, such as +12 volts, -12 volts, binary command 0110, etc. When a new type of peripheral device, such as a new camera, is added the new device may require control signals which are completely different than any existing device so the control signals presently provided by the camera network converter would not give the desired results. In the present invention the network standard signals do not change. Rather, new software is written for the camera network converter so that the camera network converter provides the appropriate signals to the new camera, such as +7 volts, -3 volts, binary command 100110, etc.
In this manner, peripheral devices from different manufacturers and new peripheral devices are readily accommodated by adding new software for the controller. The user can then instruct the controller to load the new software into the converter so that the converter is now configured for the new device.
The present invention also provides for control of devices on remote systems. The use of network standard signals allows a user at a local site to easily control a device at a remote site, even if the controller at the local site does not have software appropriate for that type of device. The controller at the local site receives the network standard signals corresponding to the action taken by the user and determines the action (pan left, pan right, etc.) required at the remote site. The local controller then sends the network standard signals for the action to the remote controller. The remote controller receives the network standard signals from the local controller and sends these network standard signals to the remote network converter for the device, and the remote network converter does have the appropriate software for the remote device. The remote network converter then converts the network standard signals into the signals appropriate for that type of peripheral device.
The present invention provides alternative methods of adjusting the pan, tilt, zoom and focus of a camera. In one method the user positions a pointer over an object displayed on a monitor and clicks a mouse button. This causes the camera to be automatically positioned so as to center the object in the monitor display. In another method the user uses the pointer to draw a rectangle around the object or area of interest. This causes the camera to be automatically positioned to center the object in the monitor display and adjust the zoom and focus so that the designated area in the rectangle fills the display. This is a substantial improvement over prior art systems in that a camera may be automatically positioned for objects or areas for which there are no preset values.
The present invention provides an improvement to panning. The panning speed is automatically adjusted in accordance with the current zoom (field of view) setting. When the camera is zoomed in, panning will occur at a slow rate so that objects do not fly by at high speed. When the camera is zoomed out, panning will occur at a fast rate so that objects do not crawl by at slow speed. The result is that, regardless of the zoom setting, objects appear to move across the scene at a fixed, comfortable rate, which is user selectable.
The present invention provides an improvement to panning and tilting the camera. When the camera position is to be changed, the time to complete the change in the pan position is determined and the time to complete the change in the tilt position is determined. Then, the faster process is slowed down so as to be completed at the same time as the slower process. This causes the camera to move smoothly and linearly from the starting position to the ending position.
The present invention provides a method for automatically focusing the camera. Each time that the camera is positioned toward and manually focused on an object or area the system automatically stores the camera position and the focus setting. When the camera is next positioned toward the object or area the system automatically recalls the stored focus setting and implements that setting. The present invention defines relationships between regions so that a focus setting may be determined even if that region has not been used before.
The present invention further provides for automatic selection of the camera to be controlled. The user simply positions a pointer over the desired scene and the system automatically selects, for further control, the camera which is providing that scene. This method is particularly useful when picture-within-picture, split screen, and four-quadrant screen displays are in use.
Brief Description of the Drawing
Figure 1 is a block diagram of the preferred embodiment of the present invention.
Figure 2 is a block diagram of a serial interface-type network converter.
Figure 3 is a block diagram of a parallel interface-type network converter.
Figure 4 is a block diagram of a specialized-type network converter.
Figures 5A and 5B are a flow chart of the method used for positioning a camera.
Figures 6A and 6B are an illustration of the operation of the automatic zoom feature of the present invention.
Figure 7 is a flow chart of the method for controlling the aim point and the zoom operation of the camera.
Figure 8 is a schematic block diagram of a video unit control node.
Figure 9 is a schematic block diagram of an audio unit control node.
Figures 10A-10C are illustrations of the relationship between regions.
Figures 11A and 11B are a flow chart of the camera focusing process.
Figure 12A is an illustration of the preferred embodiment of a camera of the present invention.
Figure 12B is an illustration of the feedback system associated with the camera controls.
Figure 13 is an illustration of a two-monitor videoconferencing system of the present invention.
Detailed Description
Turning now to the drawings, in which like numerals reference like components throughout the several figures, the preferred embodiment of the present invention will be described.
System Overview
Figure 1 is a block diagram of the preferred embodiment of the present invention. The videoconferencing system comprises a controller 10, a plurality of network converters (C) 11A-11K connected to a network 23, a mouse 12, a control panel 13, an audio unit control node 14, a video unit control node 15, a coder-decoder (codec) 16, a camera unit control node 17, a joystick 18, a power supply 19, a video cassette recorder/playback unit (VCR) 20, monitors 21, and a modem 22. The video teleconferencing system also comprises items which, for the sake of clarity, are not shown in Figure 1, such as: cameras, pan/tilt and zoom/focus units for the cameras, microphones, speakers, audio cabling, video cabling, and telephone and power wiring. Each device 10, 12-22 is connected to a converter 11A-11K. The converters are connected, preferably in a daisy-chain (serial) manner, via the network designated generally as 23. Converter 11A is shown as part of controller 10, and converters 11B-11K are shown as being stand-alone components which are separate from their respective connected devices 12-22. However, this is merely a preference and any converter 11 may be a stand-alone component or may be a part of its associated device. In the preferred embodiment, the network 23 is the LON-based network developed by Echelon, Inc., Palo Alto, California. However, other networks, such as Ethernet, may be used.
Each converter 11 contains information which either converts network standard signals on network 23 into control signals for the connected device 10, 12-22, converts control/status signals for the connected device(s) into network standard signals for network 23, or both. For example, network converter 11B will convert signals from the mouse 12 into network standard control signals which represent the mouse movement, such as left, right, up, down, button 1 depressed, button 1 released, etc. Network converter 11B provides the same network standard control signals for a particular type of mouse movement regardless of the type of mouse being used. In operation, network standard control signals from control devices such as mouse 12, control panel 13, joystick 18 or codec 16, are sent, via converters 11 and network 23, to controller 10. It is also possible for a single converter to service two or more devices, such as converter 11B servicing mouse 12 and joystick 18, and converter 11I servicing two monitors 21A and 21B. When sending information concerning the user's movement of devices 12 or 18, converter 11B also sends information as to whether the activity is associated with the mouse 12 or the joystick 18. The controller 10 then inspects these network standard control signals to determine the type of action requested by the user and the device which should take the action, generates network standard control signals corresponding to the desired action, and places these signals onto the network 23. As in any network, a converter 11 inspects the address of the incoming network standard signals on the network 23 to determine if the data is intended for that converter or its connected device. If so, then the converter 11 will capture the data, which is a network standard control signal representing the desired action, and convert the data into the appropriate type of signal for the connected device. For example, assume that the user has used the mouse 12 to select a camera (not shown in Figure 1) and has moved the mouse in a direction which indicates that the selected camera should pan to the left. The mouse movement signals are converted by converter 11B into network standard control signals indicating, for example, the direction of the movement of the mouse and the status of the buttons on the mouse (depressed, not depressed). Converter 11B then generates an address for controller 10 and places these network standard signals on network 23. Converters 11C-11K ignore these signals because the address indicates that the signals are not for them. Converter 11A recognizes the address as its own, captures these signals, and provides the signals to controller 10. Controller 10 determines that the network standard control signals signify a mouse movement corresponding to an instruction for the selected camera to pan to the left and, accordingly, generates network standard control signals corresponding to such camera movement. Controller 10 then instructs converter 11A to address these signals to the network converter for pan/tilt unit control node 17 and to place these signals on network 23. Converter 11G recognizes the address as its own (or as intended for its connected pan/tilt device), and captures the network standard signals. Converter 11G then generates control signals appropriate for the type of pan mechanism (not shown) used with the selected camera.
Therefore, even if the type of mouse is changed or the type of pan/tilt mechanism is changed, the network standard signals from the mouse or to the pan/tilt mechanism will not change. Rather, the network converters 11 will convert the signals from the mouse 12 into network standard signals and will convert the network standard signals into signals appropriate for the pan/tilt mechanism. As an example, the signals from mouse 12 may indicate that mouse 12 is being moved to the left at a certain rate and the appropriate signals provided to the pan motor may be +12 volts or, if the pan motor has a digital controller or interface, the signals provided by converter 11G may be a binary signal such as 101011 or some other code which corresponds to the code and format required to achieve the specified action.
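The converter-side translation can be pictured as a lookup table that controller 10 loads into the converter; swapping in a new device then means loading a new table rather than changing the network standard signals. The table contents below are invented for illustration:

```python
# Hypothetical sketch of the converter-side translation described above.
# Each network converter 11 carries device-specific software, loaded by
# controller 10, mapping network standard control signals to the raw
# signals a particular pan/tilt mechanism expects (codes are illustrative).
PAN_TILT_TABLE_V1 = {
    "PAN_LEFT": b"\x06",   # e.g. binary command 0110 for the original device
    "PAN_RIGHT": b"\x09",
}
PAN_TILT_TABLE_V2 = {
    "PAN_LEFT": b"\x96",   # e.g. binary command 10010110 for a newer device
    "PAN_RIGHT": b"\x69",
}

def convert(network_signal, table):
    # The network standard signal never changes; accommodating a new device
    # means loading new software (here, a new table) into the converter.
    return table[network_signal]
```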
It will be appreciated that, for a simple action, such as pan left or right and tilt up or down, controller 10 may not be required and converters 11B and 11G may be programmed to achieve the desired correspondence between the movement of the mouse 12, the depression of keys on control panel 13, and movement of the pan motor. However, in the preferred embodiment, mouse 12 is also used to specify functions which do not have a one-to-one correspondence between mouse movement and pan motor action, such as the point-and-click and the draw-and-release operations described below, and therefore all network signals are directed to or come from controller 10.
Similarly, status information from monitor control node 21 is addressed by converter 11I to controller 10 (converter 11A) and then placed on network 23. Controller 10 then inspects the status information to determine if the selected monitor (not shown) is in the proper mode, such as on or off.
Control panel 13 is a conventional videoconferencing system control panel, well known in the art, and provides, via buttons, such functions as pan left, pan right, tilt up, tilt down, mute on/off, zoom in/out, focusing, presettable camera settings, and volume up/down. Audio unit control node 14 controls the flow of audio signals among the devices which send or receive audio signals, such as microphones, speakers, codec 16, telephone lines, and VCR 20. Video unit control node 15 controls routing of video signals among the different devices which send or receive video signals such as codec 16, VCR 20, cameras, and monitors 21. Codec 16 provides conventional codec functions. Camera unit control node 17 controls the pan, tilt, zoom, and focus of the cameras and provides feedback regarding these parameters. Power supply 19 provides operating power for the converters 11 and also for the other devices 10, 12-18, 20-22 connected to the system. VCR 20 is a conventional video cassette recorder/playback device. Monitors 21 are commercially available monitors and, in the preferred embodiment, are Mitsubishi color televisions, model CS-35EX1, available from Mitsubishi Electronics America, Inc., Cypress, California. Modem 22 is a conventional modem, preferably having a data communications rate of at least 9600 bits per second.
Those of skill in the art will appreciate that a typical codec 16 has a port for connection to one or more dial-up or dedicated telephone lines. There are several different protocols which can be used for codec-to-codec communications. If the codecs are using the same protocol then they can negotiate as to what features, such as data transfer rate, data compression algorithms, etc., are to be used in the videoconferencing session. However, the codecs must be configured to use the same protocol or information transfer is not possible. If one codec has been configured to use a first protocol and a second codec has been configured to use a second protocol then the codecs will not be able to communicate. Codecs generally have a keypad and a display which are used for setting up the codec. However, the codes for setting up and the display indicating the stage of setup or the results of the entered code are typically not intuitive. Therefore, setting up (configuring) a codec for a particular protocol is, in most cases, a tedious and time consuming task which is preferably performed by a technician who is familiar with the instruction and result codes used by that codec. However, codecs have a data port which can also be used for transferring data as well as for setting up the codec. This data port is advantageously used in the present invention to allow a codec 16 to be configured by the controller 10. In the preferred embodiment, codec 16 is a type Visualink 5000, manufactured by NEC America, Inc., Hillsboro, Oregon.
Using, for example, the mouse 12 or the control panel 13, the user can instruct controller 10 to establish the videoconferencing session. Controller 10 will, via converters 11A and 11F and network 23, instruct codec 16 to dial up or otherwise access the remote codec (the codec at the other videoconferencing location). Codec 16 will then attempt to establish communications with the remote codec. If communications are successfully established the codecs will negotiate what features will be used and then the session may begin. However, if communications cannot be established, such as because the codecs are configured for different protocols, the local codec 16 will report to controller 10 that codec 16 was able to contact the remote codec but was unable to establish communications (handshake) with the remote codec because the remote codec was using a different protocol. Controller 10 will then, via converters 11A and 11N, instruct modem 22 to dial up the remote modem (the modem for the videoconferencing system at the other location). Once controller-to-controller communications have been established via modem then controller 10 can instruct the remote controller to configure the remote codec for a particular protocol. The remote controller will take action, if necessary, to configure the remote codec to the same protocol. Conversely, controller 10 can receive information from and/or negotiate with the remote controller as to the protocol(s) supported by, or the current configuration of, the remote codec and then configure codec 16 to the same protocol as the remote codec. Then, controller 10 can again instruct codec 16 to establish communications with the remote codec and, as both codecs have now been configured to the same protocol, the codecs can establish communications and negotiate features, and the videoconferencing session can begin.
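The fallback sequence just described can be sketched as follows. The object names and method calls are hypothetical; actual codecs and modems are driven through their data ports with device-specific command sets.

```python
def establish_session(codec, modem, remote_number, remote_modem_number):
    """Sketch of the session setup fallback described above (hypothetical API).

    codec and modem are stand-ins for codec 16 and modem 22; the method
    names are illustrative, not the devices' actual command sets.
    """
    if codec.connect(remote_number):          # handshake succeeded
        return True
    # Handshake failed -- likely a protocol mismatch. Reach the remote
    # controller out of band and agree on a common protocol.
    link = modem.dial(remote_modem_number)
    protocol = link.negotiate_protocol(codec.supported_protocols())
    codec.configure(protocol)
    link.request_remote_configure(protocol)
    return codec.connect(remote_number)       # retry with matching protocols
```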
The present invention also provides for local control of remote devices. In addition to controller 10 being able to communicate with any device 12-18, 20-22 on the local network 23, controller 10 may also communicate with a similarly situated controller at a remote site (not shown) via the data port on codec 16. The user, using mouse 12, control panel 13, or joystick 18, may command a particular action to be performed at the remote site, such as panning the remote camera to the left or right, tilting the remote camera up or down, etc. The user's actions are converted into network standard control signals and these signals are sent by converter 11B to controller 10. Controller 10 determines the action required at the remote site and sends, via network 23 and codec 16, network standard control signals corresponding to the action to the remote controller. The remote controller then sends, via its own network, the network standard signals to the converter for the remote pan/tilt unit. The remote converter then generates the appropriate instruction for the remote pan/tilt unit control node which, in turn, causes the pan/tilt mechanism for the selected remote camera to perform the action specified by the user at the local site. The user at the local site can therefore control all of the functions of all the devices at the remote site that the remote user can control at the remote site, even if the remote site has devices available which are not available at the local site. However, in practice, some functions at a site are preferably controlled only by the user at that particular site, such as microphone muting, monitor on/off operation, and speaker volume control settings.
The present invention also provides for system diagnostics. In the preferred embodiment, camera unit control node 17, in addition to receiving instructions from controller 10, also reports the results of an instruction to controller 10. Each pan/tilt unit has a position indicator, either as part of the unit or as a retrofit device. The position indicator indicates the current pan position and the current tilt position. The camera unit control node 17 accepts the position signals from the position indicator and provides these signals to the controller 10. Controller 10 inspects these signals to determine whether the selected pan/tilt unit is taking the proper action with respect to the control signals. For example, assume that controller 10 has instructed a particular pan/tilt unit to pan in a certain direction at a certain rate but that the pan/tilt unit either does not pan, or pans at a different rate. The camera unit control node 17 reports the response of the selected pan/tilt unit to controller 10. If the response of the selected pan/tilt unit is improper then controller 10 will cause a report to be generated which alerts the system operator to the problem. The report may be provided in a number of ways. For example, the presence of the report may be indicated by an icon on the screen of a monitor 21. This alerts the system operator to select the report to ascertain the nature of the problem. Or, the controller 10 may cause a report to be printed, either by a printer (not shown) connected to a printer port on controller 10 or by a printer (not shown) connected as another device on the network 23. The report may also indicate the severity of the problem. For example, a slow pan is generally not a critical item, but indicates that the pan/tilt unit should be serviced in the near future to prevent the complete failure of and/or damage to the unit. Conversely, a unit which does not pan at all requires immediate servicing as continued attempts by the user to cause that pan/tilt unit to pan could result in gear damage or motor burnout. Modem 22 also allows for remote diagnostics and reporting. If the videoconferencing system is, for example, being serviced by a remote party then the remote party can, using a personal computer and a modem, call up modem 22, establish communications with controller 10, and instruct controller 10 to send, via modem 22, the current system diagnostics. Furthermore, controller 10 can be programmed to use modem 22 to call up the remote party, establish communications with the remote computer, and automatically send the current system diagnostics. The programming may specify that the call is to be performed at a certain time of day, such as during off-duty hours, or whenever a serious failure occurs, such as the complete failure of a pan/tilt unit, or both.
The controller-to-controller communications, via either codecs or modems, also allows the controller at one site, such as a remote site, to inform the controller at another site, such as the local site, that a particular device or function is inoperative at the remote site. Then, when the user attempts to use that device or function the local controller will disregard the instructions from the user and inform the user that that device or function is out of service.
Controller 10, in addition to performing system diagnostics, also attempts simple system repairs. For example, if the pan/tilt unit will not pan in one direction, controller 10 will instruct the pan/tilt unit to pan in the other direction so as to attempt to dislodge any cable which may be snagged. If this action is successful and the pan/tilt unit is then operational, controller 10 will log the failure and the repair so that the service technician will know to inspect that unit for loose or snagged cables and to service that unit. If the action is not successful then controller 10 will disregard future instructions from the user as to the desired movement of that pan/tilt unit and will not attempt to send further instructions with respect to the failed function. That is, pan instructions will not be sent because the pan function is not operative, but tilt instructions may be sent because that function still operates properly. However, as another option, controller 10 may be programmed to cause operating power to be entirely removed from the failed pan/tilt unit.
Similar action and reporting may be taken with respect to other functions and devices. For example, the camera unit control node 17 also controls the zoom and focus of the connected cameras (not shown). In the preferred embodiment, the cameras have a zoom position indicator and a focus position indicator, either as part of the unit or as a retrofit device. Controller 10 can therefore determine whether a selected camera is operating properly. Also, each monitor 21 has an on/off indicator, described below, and converter 11I reports the status of each monitor. Controller 10 can therefore determine whether a selected monitor is on or off. Also, codec 16 performs limited self-diagnostics on its own operation. Controller 10, either in response to an error signal from codec 16, or at periodic intervals, will instruct codec 16 to report its status. Controller 10 can then take the appropriate reporting action, if any is required, and/or switch to another codec (not shown) connected to network 23.

In the preferred embodiment of the present invention the LON network is used because converters 11, in general, draw operating power via the network 23 and do not require a separate source of power nor require power from the connected device. This is advantageous in that the network and the system will continue to function even if a connected device, such as VCR 20 or modem 22, is removed from the network or is powered down.

In the preferred embodiment, a power supply 19 is connected to the network 23 and provides operating power for the converters 11. Power supply 19 also provides operating power, such as 110 VAC or 12 VDC, to each peripheral device. This operating power may be provided via network 23 or provided via separate power cables to each peripheral device. Power supply 19 provides AC and DC power, as required, to each peripheral device. Power supply 19 is connected to converter 11K and may therefore be controlled by the user. This allows the user to turn on and turn off selected peripheral devices, as desired, by removing operating power from the device. This provides an additional way of turning off a device if the device is otherwise non-responsive to signals sent via network 23, and also provides a safety factor in that the user can completely remove operating power from a device.

Further, in the preferred embodiment, converter 11K has an internal timer. If there is no user activity, signified by a lack of activity of mouse 12, control panel 13, or joystick 18, then converter 11K will send a "sleep" signal to controller 10. This causes controller 10 to go into a standby mode, thereby conserving power. Converter 11K will also instruct power supply 19 to remove operating power from the peripheral devices. Although converter 11K and power supply 19 are shown as separate devices, it will be appreciated that both functions may be performed by a single device. In an alternative embodiment, power supply 19 is not responsive to signals on network 23 but merely provides operating power for the converters 11. In this embodiment either controller 10 or converter 11K may have the internal timer. In another alternative embodiment, power supply 19 is not used and controller 10 has the internal timer, and also provides operating power for the converters 11 on network 23 via the connection to converter 11A.
In the preferred embodiment, controller 10 is a personal computer, such as a COMPAQ Prolinea, having a 120 megabyte hard drive, a 4 megabyte random access memory, and a 3-1/2-inch floppy disk drive. Controller 10 does not need to have a screen or a keyboard because, in the preferred embodiment, a monitor 21 is used as a screen, and mouse 12 and control panel 13 may be used in place of a keyboard. However, if desired, a screen and a keyboard could be connected directly to controller 10. Also, even though mouse 12, joystick 18, and modem 22 are shown as being connected to converters 11 on network 23, it will be appreciated that the converters associated with these devices may be dispensed with if card slots for controlling these devices are available in controller 10 and the distance between the device and controller 10 is not excessive.
Also, even though only one mouse 12, codec 16, joystick 18, VCR 20, and modem 22 are shown, it will be appreciated that the present invention is not so limited and that a plurality of each type of device may, if desired or necessary, be connected to network 23.
In addition, even though mouse 12, control panel 13, and joystick 18 are shown as being connected to converters 11B and 11C by wiring, it will be appreciated that there are commercially available devices 12, 13, and 18 which do not have a wire connection but, instead, communicate by infrared (IR) signals. These devices may also be used with the present invention. In this case the appropriate network converter 11 would have an IR receiver, would respond to the infrared signals, and would provide the corresponding network standard signals to controller 10. Converters 11B and 11C would then be specialized purpose converters. A specialized purpose converter is described below which transmits IR signals to IR receivers in monitors 21. In the present case the roles of transmitter and receiver are reversed, that is, the devices 12, 13, and 18 transmit and the converters 11B and 11C receive.
Network Converters
Converters 11 fall into three general classes: serial interface, parallel interface, and specialized purpose. Typically, a codec 16 is a serial interface device and therefore converter 11F would be a serial interface-type converter, whereas a VCR 20 may have a parallel interface and therefore converter 11H would be a parallel interface-type converter. In the preferred embodiment, monitors 21 are of the type which can be remotely controlled by, for example, a handheld infrared remote control. Converter 11I is therefore a specialized type of converter in that it can provide the infrared signals necessary to control the monitors 21 and has the necessary components for monitoring the state of operation of the monitors 21.

Figure 2 is a block diagram of a serial interface-type network converter 11. A serial-type converter 11 comprises a network connector 40, a power supply/filtering circuit 41, an RS-485 transceiver 42, a parallel-serial and serial-parallel (P/S-S/P) converter 48, a microprocessor 43, a basic program memory 44, an installed program memory 45, a set-up button 46, a display 47, an RS-232 charge pump/transceiver 50, and a serial port connector 51. Connector 40 is connected to network 23 and connector 51 is connected to a serial interface device, such as codec 16. Power supply/filtering circuit 41 draws power from network 23 and provides filtered power to the several circuits of converter 11. Transceiver 42 provides voltage level, balanced-to-single-sided (unbalanced), and single-sided-to-balanced conversion of the signals between network 23 and P/S-S/P converter 48. P/S-S/P converter 48 provides parallel-serial and serial-parallel conversion of the signals between transceiver 42 and the microprocessor 43. In the preferred embodiment, microprocessor 43 is a Neuron microprocessor, manufactured by Motorola Semiconductor Products, Phoenix, Arizona, and the P/S-S/P conversion functions of converter 48 are performed by the microprocessor 43.

Basic program memory 44 contains an identification number, such as a serial number, start-up procedures and basic operating instructions for microprocessor 43, such as instructing microprocessor 43 of the port or address of transceivers 42 and 50, button 46 and display 47. In the preferred embodiment, memory 44 is a programmable read only memory (PROM). Installed program memory 45 contains configuration information and operating instructions as to the conversion required between signals present on network 23 and the corresponding signals to be output via connector 51, and vice versa. Examples of the type of information that may be installed in memory 45 are the voltage polarity and voltage levels required to control the connected peripheral device, the binary codes and format required to control the connected peripheral device, and similar information concerning signals that may be received from the connected peripheral device. In the preferred embodiment, memory 45 comprises both an electrically erasable programmable read only memory (EEPROM) and a random access memory (RAM). Button 46 is used to initialize (set up) converter 11, as explained in more detail below. Display 47 is, in the preferred embodiment, a light emitting diode (LED) and is off when microprocessor 43 has been properly set up (configured), and flashes when microprocessor 43 is in the set-up mode (not configured).
In the preferred embodiment, controller 10 contains, in its memory (not shown), a plurality of programs for the operation of converters 11. There is a separate program for each type of device that may be connected to a converter. Converters 11F and 11J are both serial interface-type converters. However, one is connected to codec 16 and the other is connected to modem 22, and therefore each requires different operating instructions so as to properly interface with the connected device. Therefore, for each type of converter, there is a separate program for each type of device which may be connected to that converter. A program, which may include software, firmware, data, formats, conversion codes, etc., is downloaded from controller 10 to the selected converter 11 so as to properly configure the converter 11 for the type (serial, parallel, specialized) of converter that it is and also for the type of device with which it will be connected. This provides flexibility in that if a new type of device is to be connected to the network then a program is written for that type of device and loaded into controller 10. Controller 10 then downloads the program to the converter 11 which is connected to that new type of device. Therefore, in general, a serial interface-type converter can be used with any serial interface device by simply downloading the appropriate serial interface program from controller 10 into that converter 11, and likewise for parallel interface-type devices. Also, additional devices can be easily supported by using the appropriate generic (serial-type or parallel-type) converters and then causing controller 10 to download the appropriate programs to each of the added converters. This reduces the inventory of different types of converters that the user must have on hand to repair or add to the system.

In the preferred embodiment, memory 45 in a serial-type converter 11 is not programmed with the installed program at manufacture, although it could be so programmed if desired. Therefore, when a converter 11 is first installed in the videoconferencing system and power is applied, the converter 11 will not be configured. Furthermore, if the user changes the type of serial device connected to the converter 11, such as disconnecting converter 11 from codec 16 and connecting converter 11 to modem 22, then converter 11 will be improperly configured for the newly connected device. Therefore, the user will press set-up button 46, which causes microprocessor 43 to cause display 47 to begin blinking. Also, microprocessor 43 will send its identification number and type to controller 10 along with a network standard signal which advises controller 10 that converter 11 needs to be configured.
The user will then go to controller 10 and, preferably using mouse 12, pull down an initial set-up menu (not shown). The set-up menu will list the last converter 11 which has reported a need to be configured. Then, the user will pull down another menu which lists the types of serial interface devices supported by the videoconferencing system. Once the connected serial device is identified by the user, controller 10 will download, via network 23, the program necessary to allow converter 11 to interface between network 23 and the connected serial device. Microprocessor 43 will install this program in the installed program memory 45. Microprocessor 43 and memories 44 and 45 are shown as separate components for clarity but may be in a single device. If converter 11B has not been previously configured then a mouse, such as mouse 12, may be connected to a mouse control port on controller 10 in order to configure converter 11B. Thereafter, the remaining converters may be configured using either mouse 12 or the mouse connected directly to controller 10.

Figure 3 is a block diagram of a parallel interface-type network converter 11. A parallel-type converter 11 is similar to a serial-type converter except that, instead of transceiver 50 and connector 51, converter 11 will have an output transceiver 54 and a parallel connector 57. Output transceiver 54 comprises output drivers 55 and input buffers 56. Preferably, transceiver 54 provides isolation between microprocessor 43 and the parallel interface device. Also, device 54 is preferably configurable by microprocessor 43 to select which pins on connector 57 are output pins and which pins are input pins. Devices which perform, or can be readily connected to perform, the functions of transceiver 54 are well known in the art. In the preferred embodiment, the functions of transceiver 54 are performed by the indicated Neuron microprocessor 43. The operation of a parallel-type converter 11 is identical to that of a serial-type converter except that the inputs and outputs on connector 57 are configured for a device which is a parallel interface device, such as VCR 20.
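The set-up handshake and program download can be summarized in the following sketch; the message fields, command string, and library layout are assumptions for illustration only.

# Sketch of the converter set-up flow, assuming simple message records.
unconfigured = []   # converters whose set-up button has been pressed

def on_setup_request(message):
    # Each converter reports its identification number and converter type.
    unconfigured.append((message.converter_id, message.converter_type))

def configure(network, converter_id, device_type, program_library):
    # The controller downloads the program matching the device the user
    # identified from the pull-down menu (e.g. "codec", "modem", "VCR").
    program = program_library[device_type]
    network.send(converter_id, "load_installed_program", program)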
Figure 4 is a block diagram of a specialized-type network converter, such as converter 11I. A specialized converter is useful in cases where the connected device does not have a serial or parallel interface or where that interface is already in use for some other purpose, but where there are also other means of controlling the device, such as by infrared signals or voltage level and/or polarity signals (analog signals). Converter 11I, which interfaces with monitors 21, is an example of a specialized converter. Like the serial-type and parallel-type converters, a specialized-type converter has a connector 40 for connection to the network 23, a power supply/filtering circuit 41, an RS-485 transceiver 42, a microprocessor 43, a basic program memory 44, an installed program memory 45, a set-up button 46, and a display 47. In addition, the specialized converter has a driver 61, which is capable of driving infrared (IR) LEDs 62A and 62B. Only two IR LEDs are shown, corresponding to two monitors 21, for convenience, but more monitors 21 may be used.

Each monitor 21 is, in the preferred embodiment, controllable by the use of infrared signals and has an infrared detector built into the monitor 21. This type of monitor is well known in the art. An IR LED, such as 62A, is positioned in front of the infrared detector on the monitor 21 so that microprocessor 43 can send signals to driver 61, which provides the signals to the LED 62A, which emits the infrared signals appropriate to cause monitor 21 to perform a particular action, such as turning on or off, turning the volume up or down if the speaker in monitor 21 is being used, adjusting brightness, contrast, etc.
In addition, a coil, such as coils 63A and 63B, is attached to each monitor 21. A coil 63 is used to pick up the magnetic field of the horizontal deflection coils present in a monitor 21. Coils 63A and 63B are connected to amplifier/detectors 64A and 64B, respectively. An amplifier/detector 64 amplifies the signal provided by a coil 63 and detects (rectifies) the signal. The output of each amplifier 64 is connected to buffer 65, which is connected to microprocessor 43. Buffer 65 provides any necessary buffering and voltage level shifting between the output of amplifier/detector 64 and microprocessor 43.

The on/off control signal in many monitors 21 is the same signal and the monitor 21 merely toggles between an on state and an off state. To determine whether a monitor 21 is on or off, a coil 63 is attached to the monitor 21 to pick up the radiation emitted by the horizontal deflection coil in that monitor 21. If the user sends an instruction to turn on a monitor 21, the microprocessor 43 will inspect the output of buffer 65 to determine if the coil 63 and amplifier/detector 64 associated with that particular monitor 21 are detecting radiation. If radiation is being detected then the monitor is already on and microprocessor 43 will not take any action. However, if monitor 21 is off then radiation will not be detected and, consequently, microprocessor 43 will cause driver 61 to pulse an LED 62 with the code required to toggle the on/off control of that monitor 21. Microprocessor 43 will then check the output from the coil 63 to determine if the operation was successful. If the operation was successful then microprocessor 43 will take no further action. However, if monitor 21 does not turn on then microprocessor 43 will attempt several more times to turn on the monitor 21. If, after several attempts, the monitor 21 is still not on then microprocessor 43 will report the failure to controller 10.

In the preferred embodiment, coils 63 are a type 70F103AI, manufactured by J. W. Miller, Rancho Dominguez, California. The positioning of the coils 63 on the monitors 21 is not extremely critical but it is preferred to place the coils 63 in a position to receive the maximum pickup when a monitor 21 is on so that the reliability of the on/off indication is consistently high.
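Expressed in Python, the toggle-and-verify logic reads roughly as follows; the retry count, settling delay, and callback names are assumptions, since the text says only that "several" attempts are made.

import time

MAX_ATTEMPTS = 3   # "several attempts" in the text; the exact count is assumed

def turn_monitor_on(radiation_detected, send_power_toggle, report_failure):
    """Sketch of the on/off logic: because the IR code only toggles power,
    the deflection-coil pickup is used to learn the monitor's actual state."""
    if radiation_detected():
        return True                  # monitor already on; take no action
    for _ in range(MAX_ATTEMPTS):
        send_power_toggle()          # pulse the IR LED with the toggle code
        time.sleep(1.0)              # assumed settling delay before rechecking
        if radiation_detected():
            return True              # operation successful
    report_failure()                 # tell controller 10 after repeated failures
    return False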
If a converter 11 is only to be used with a certain type of monitor then the basic program memory 44 may contain the necessary IR transmit instructions, and so installed program memory 45, set-up button 46, and display 47 will not be needed. However, if converter 11 may be used with different types of monitors then the necessary instructions for the several types of monitors may be included in basic program memory 44 or, alternatively, the type of monitor being used may be selected from a pull-down menu at controller 10 and the necessary IR transmit program downloaded from controller 10 into memory 45.
Camera Positioning

In practice, many of the tests and/or functions shown in the figures are performed by programs or subroutines which are simultaneously active so that one test and/or function may be performed concurrently with another test and/or function. That is, tests for mouse movement, mouse button depression/release, joystick movement, control panel selections, etc., are performed continuously or may be interrupt-driven functions. However, for clarity of illustrating the operation of the present invention, flowcharts are used.
Figures 5A and 5B are a flow chart of the method used for positioning a camera. In the preferred embodiment, the mouse 12 or the joystick 18 may be used to move a pointer within the display presented on a monitor, such as monitor 21A. For convenience, only the operation using mouse 12 will be discussed although it will be appreciated that joystick 18, with control buttons thereon, can be used to accomplish the same result. This particular method of positioning the camera is referred to herein as "point-and-click". This phrase describes the action required by the user to reposition the camera. That is, using mouse 12, the user causes the pointer to be positioned (pointed) over the target of interest and then clicks a button on mouse 12. Controller 10 then causes the selected camera to be aimed at the selected point so that the selected point is nominally in the center of the screen display seen by the user. This allows the user to quickly and easily designate where a selected camera should be pointing so that the user can conveniently view the desired object(s). It should be noted that this method is useful for both local cameras, that is, cameras which are at the same site as the user, and for remote cameras, that is, cameras which are at the remote site. Therefore, the user can easily adjust the remote camera to point at a desired object. This allows the user to focus a camera on a target of interest without having to instruct the person at the other end to stop whatever he or she is doing and position the camera as desired by the user.
This procedure is preferably implemented by controller 10. A starting step 100 is shown but it will be appreciated that controller 10 performs many operations and therefore a starting step should be understood to be an entry point into a subroutine, such as a subroutine used for camera positioning. In decision 101 a test is made as to whether any mouse button 12A, 12B is depressed. If so then the user is indicating that some function other than point-and-click camera positioning is to be performed and therefore other functions are tested and/or performed in step 102. If no mouse buttons are depressed then, in decision 103, a test is made for movement of the mouse. If there is no mouse movement then a return is made to decision 101. If there is mouse movement then decision 104 tests whether the pointer displayed on the screen of monitor 21A is outside the area of the monitor designated for the picture. That is, is the pointer now positioned over a control bar, selection icon, other function symbol, a different picture (picture-within-picture), or a different monitor. If the pointer is outside the picture area then the user is indicating that other functions are to be performed and controller 10 proceeds to step 102 to perform the other functions. If the pointer is within the picture area then decision 105 tests whether a mouse button, such as mouse button 12A, has been clicked. If not then a return is made to decision 101. If so then controller 10 determines in step 106 the amount of pan and tilt required to achieve the user's request. This is determined by measuring the click position of the mouse with respect to the center of the screen, and the amount of zoom presently employed.

Decision 107 tests whether the amount of pan required is greater than the resolution error of the pan mechanism. That is, if the amount of pan required is one degree but the pan mechanism has a resolution error of two degrees, then panning should not be done. If panning is not to be done then decision 108 is executed. Decision 108 tests whether the tilt required is greater than the resolution error of the tilt mechanism. If the tilt required is not greater than the resolution error then a return is made to decision 101 because it has been determined that neither pan nor tilt is required. If, in decision 108, the tilt required is greater than the resolution error then step 112 is executed next. Referring back to decision 107, if the pan required is greater than the resolution error then, in step 110, the pan rate is determined. Then, in decision 111, a test is made as to whether the tilt is greater than the resolution error. If not then step 113 is executed next. However, if the tilt is greater than the resolution error then the tilt rate is determined in step 112.

Although this process causes the movement along both axes to be completed at the same time, an undesirable effect may occur when moving long distances, such as from one preset location to another when the field of view is narrow. Assume, for example, that the field of view is 6 degrees, and the pan angle will be 60 degrees. If the pan rate is selected to cause the object to move across the field of view (6 degrees) in time T, then it will take 10T seconds for the camera to reach its destination. However, if the pan rate is selected to cause the camera to traverse the full distance in T seconds, then the 6 degree field of view will cause objects to fly across the scene in a blur.
Therefore, in the preferred embodiment, if the camera is to pan over a long distance the camera is zoomed out (and focused accordingly) so that the camera has a wide field of view. The high speed pan rate will then allow the movement from start to finish to occur in a timely manner but, because the camera is zoomed out, an object will be reduced in size and will move at an acceptable rate across the display screen. At the end of the pan operation the camera is zoomed in (and focused accordingly) as specified by the destination location.
Therefore, in decision 113, controller 10 determines whether the pan distance is sufficiently large to require zooming out. If not then step 115 is executed. If so then the camera is zoomed out and then step 115 is executed. In step 115 pan, tilt, and/or zoom, as required, are begun.
Decision 116 tests whether the pan/tilt operation has been completed. If not then a return is made to decision 116. If the operation is complete then the zoom and focus are restored in step 117, if necessary, and the process of camera movement is ended in step 118.

The rates of pan and tilt are determined by considering the desired number of seconds that it should take an object to move from one end of the field of view to the other end of the field of view. In the preferred embodiment, this setting is programmable at controller 10. The display is considered to have a 2x3 aspect ratio (V to H). If it is desired that the object remain within the field of view for, for example, two seconds, and the field of view is, for example, 30 degrees, the pan speed will be set to 15 degrees per second and the tilt speed will be set to 10 degrees per second. By synchronizing the movements of the pan and tilt mechanisms in this manner the camera will reach the desired position, with respect to both axes, at approximately the same time. This has the desirable effect of making the camera positioning appear smooth. Otherwise, the camera may reach the desired position with respect to one axis first, for example the vertical axis, and then have to continue moving with respect to the other axis until the desired location is achieved, which makes the camera movement appear awkward.
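The rate calculation reduces to a few lines; the sketch below assumes the 2x3 (V to H) aspect ratio described above and reproduces the worked example of a 30-degree field of view crossed in two seconds.

def pan_tilt_rates(field_of_view_deg, traverse_seconds):
    """Rates chosen so that an object crosses the field of view in the
    programmed number of seconds; the 2x3 aspect ratio scales the tilt."""
    pan_rate = field_of_view_deg / traverse_seconds  # horizontal, deg/s
    tilt_rate = pan_rate * 2.0 / 3.0                 # vertical, deg/s
    return pan_rate, tilt_rate

# Worked example from the text: 30 degrees / 2 seconds -> (15.0, 10.0)
print(pan_tilt_rates(30.0, 2.0))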
The point-and-click method of camera control is a major improvement over existing button methods of camera control. However, if the field of view is narrow, it may take several point-and-click operations to pan the camera from one position to another position. Therefore, rather than follow the pointer movement only in discrete increments when the mouse button is clicked, the present invention provides an alternative form of movement. If this alternative form is selected by the user, such as by using a pull-down menu or by pressing a different mouse button such as button 12B, the camera will dynamically follow the pointer. In this case, if the pointer is moved slowly toward the side of the display, controller 10 will cause the camera to slowly pan toward that side. When the pointer is positioned all the way to the side of the display, or at some predetermined border point, controller 10 instructs the pan/tilt unit to move at its maximum speed. Controller 10 automatically zooms out the camera when panning at high speed and automatically zooms in the camera to its original setting when the pointer is no longer at the side of the display and the pan speed is dropped to a slower rate. Of course, the user can adjust the zoom at any time.
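One plausible mapping from pointer position to pan rate is sketched below; the proportional law, sign convention, and border width are assumptions, since the text requires only slow panning near the center and maximum speed at the border.

def follow_pointer_pan_rate(pointer_x, screen_width, max_pan_rate, border=0.05):
    """Map the pointer's horizontal offset from screen center to a pan rate.
    Positive rates pan right, negative rates pan left (convention assumed)."""
    center = screen_width / 2.0
    offset = (pointer_x - center) / center   # -1.0 at left edge .. +1.0 at right
    if abs(offset) >= 1.0 - border:
        return max_pan_rate if offset > 0 else -max_pan_rate  # border: full speed
    return offset * max_pan_rate             # elsewhere: proportional speed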
Figures 6A and 6B are an illustration of the operation of the automatic zoom ("draw-and-release") feature of the present invention. Figure 6A is an illustration of a monitor 21 having a screen 125, which is displaying a person 126 sitting at the end of a table 127. Assume now that the user wishes to focus on the person 126. Using a conventional system the user could adjust the pan and tilt controls and then adjust the zoom and focus controls so as to zoom in on person 126. However, using the present invention the user will simply use the mouse 12 to place the pointer at the desired pointer starting point (PSP), depress and hold a predetermined mouse button, such as the left button 12A, and drag the pointer across the area of interest, which causes a rectangular box to begin spreading across the screen, with one corner at the PSP. When the user reaches the desired ending point, the pointer ending point (PEP), the user will release the mouse button. The user has thereby drawn a rectangle around the area of interest and released the mouse button. Controller 10 will then determine the appropriate pan and tilt for a camera and cause the camera to center its field of view on the center of the rectangle (CR), then cause the camera to zoom in so that rectangle 128 fills, as fully as possible, screen 125, and also cause the camera to refocus, if necessary. The resultant display is seen in Figure 6B, which illustrates that the camera has been repositioned so that CR is now in the middle of the display (MD). Therefore, by the simple tasks of positioning the pointer in one corner of the desired scene, depressing a mouse button, dragging the mouse to draw a rectangle, and releasing the mouse button, the user has caused the selected picture area to be expanded to fill the display 125. The use of point, click, drag, and release techniques to draw a box, such as box 128, is, in general, well known in the personal computer field.
Figure 7 is a flow chart of the method for controlling the aim point and the zoom operation of the camera. Upon starting 130, controller 10 tests, at decision 131, whether the appropriate mouse button has been depressed. If not then, in step 132, controller 10 tests for and/or performs other functions. If the mouse button has been depressed then, in step 133, controller 10 records the initial pointer position PSP. Then, in decision 134, controller 10 tests whether the mouse button has been released. If the mouse button has not been released then the user has not completed drawing the desired rectangle 128. Once the mouse button is released then the user has completed drawing rectangle 128 and has therefore designated the area of interest.

Controller 10 therefore proceeds to step 135 and performs the following operations. First, the final pointer position PEP is recorded. Then the midpoint CR of the drawn rectangle 128 is calculated based upon the initial and final pointer positions PSP and PEP. Controller 10 then calculates the difference between the midpoint CR of rectangle 128 and the midpoint MD of display 125. These steps determine the pan and tilt required to center the desired picture on screen 125 and, although performed automatically, are analogous to the user moving the pointer to position CR and then clicking on the mouse, as in the procedure described with respect to Figure 5. Controller 10 then performs steps 106 through 117 of Figure 5 except that the "No" output of decision 108 does not return to step 101 but moves to substep 5 of step 135.

The result of substeps 1-4 of step 135 is that controller 10 has caused the camera to pan and tilt so as to place the center CR of rectangle 128 at the midpoint MD of display 125. However, controller 10 must still determine how much zoom is required to satisfy the request of the user. Therefore, controller 10 determines the X-axis movement XM of the pointer and the Y-axis movement YM of the pointer. Controller 10 then adds the X-axis movement and the Y-axis movement to obtain the total movement of the pointer. Controller 10 then determines the ratio of the total movement (XM + YM) to the total size (XD + YD) of the screen 125 of monitor 21. Controller 10 then determines a new field of view by multiplying the above ratio times the current field of view. It will be appreciated that the current field of view is information which may be obtained from the zoom mechanism on the camera. Controller 10 then causes the camera to zoom to the new field of view or, if the new field of view is less than the minimum field of view supported by that camera, to zoom to the minimum field of view supported. Controller 10 then instructs the camera to focus, either by an auto focus process or by a memory process such as described below, and then the procedure ends.
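A minimal sketch of the step 135 arithmetic is given below, assuming PSP and PEP are (x, y) positions in screen coordinates; the function and parameter names are illustrative only.

def draw_and_release(psp, pep, screen_w, screen_h, current_fov, min_fov):
    """Compute the new aim point and field of view for a drawn rectangle."""
    xm = abs(pep[0] - psp[0])                  # X-axis pointer movement (XM)
    ym = abs(pep[1] - psp[1])                  # Y-axis pointer movement (YM)
    cr = ((psp[0] + pep[0]) / 2.0,             # center CR of rectangle 128
          (psp[1] + pep[1]) / 2.0)
    ratio = (xm + ym) / (screen_w + screen_h)  # (XM + YM) / (XD + YD)
    new_fov = max(ratio * current_fov,         # scale the current field of view,
                  min_fov)                     # clamped at the camera's minimum
    return cr, new_fov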
The rectangle 128 illustrated in connection with Figure 6A has XM and YM proportions such that zooming in will cause rectangle 128 to nicely fill screen 125. However, it will be appreciated that the user may not always draw such a well proportioned rectangle. The user may draw a rectangle which is very wide and has minimal height or is very tall but has minimal width. In such a case, due to limitations imposed by the shape of screen 125, it is not possible to expand the picture as desired by the user. Therefore, an alternative process must be followed. One possible alternative approach is to expand rectangle 128 so that the larger of XM and YM is used to determine the zoom required. This approach will display to the user all of the area encompassed by rectangle 128 as well as some picture area outside of rectangle 128, as necessary to fill up screen 125. In another alternative approach, the smaller of XM and YM is used to determine the amount of zoom required. In this case the smaller measurement is expanded to fill up screen 125 and some of the area of rectangle 128 encompassed by the larger dimension of rectangle 128 will exceed the limits of screen 125 and therefore will not be shown to the user.

Audio and Video Control Nodes
Figure 8 is a schematic block diagram of a video unit control node 15. In the example shown, video unit control node 15 is connected to three cameras 150A-150C, three monitors 21A-21C, and a VCR 20. It should be understood that the number of cameras, monitors and VCRs is a design choice and is limited only by the video switching capability of node 15, which is primarily determined by cost considerations. Video unit control node 15 selectively routes video signals from cameras 150, VCR 20, codec 16 and the auxiliary input, to monitors 21, codec 16, VCR 20 and the auxiliary output. As is well known in the art, codec 16 has a motion input and a motion output, for scenes which frequently change, and a graphics input and a graphics output for scenes which infrequently change, such as slides and graphs.
Video unit control node 15 comprises a plurality of video input buffers designated generally as 151, which are connected to the inputs of an 8x8 video switch matrix 152, which is connected to a plurality of output buffers designated generally as 153, a control logic 154, a video overlay device 155, a sync generator input lock signal buffer 160, a plurality of sync separators 161A-161C, a sync generator and phase locked loop (PLL) circuit 162, and a black burst output distribution amplifier 164. Buffers 151, which also perform DC restoration to the input signal, and buffers 153 buffer the incoming and outgoing video signals in a conventional manner. Likewise, switch matrix 152 switches the input signals from cameras 150, VCR 20, codec 16, the video overlay circuit 155, and the auxiliary input to the desired destination device, such as monitors 21, codec 16, VCR 20, and the video overlay circuit 155. Control logic 154 is connected between converter 11E and switch matrix 152. As will be recalled from a reading of the operation of the system in conjunction with Figure 1, converter 11E extracts signals from network 23 which are intended for video control node 15 and converts the signals into the proper format for control node 15. Control logic 154 accepts the signals from converter 11E and sends corresponding control signals to switch matrix 152, sync generator and PLL circuit 162, and video overlay circuit 155.
Sync generator input lock signal buffer 160 has an input connected to a Genlock input signal, and an output connected to a sync separator 161A. Sync separator 161A, in a well known manner, recovers and separates the vertical synchronization signals from the horizontal synchronization signals. The output of buffer 160 and the output of sync separator 161A are connected to inputs of sync generator and PLL circuit 162. Circuit 162 provides a black burst output which is synchronized to the selected input signal. For NTSC signals the output of buffer 160 is used as the sync source; for PAL signals the output of sync separator 161A is used as the sync source. Control logic 154 directs circuit 162 as to which input signal should be used for synchronization.
The outputs of buffers 151C and 151D are connected to the inputs of sync separator circuits 161B and 161C, respectively. The outputs of circuits 161B and 161C are connected back to inputs of buffers 151C and 151D, respectively, so that DC restoration is performed based upon the actual input signal. In a similar manner, the outputs of buffers 151A, 151B, and 151E-151H could be provided to sync separator circuits, and the outputs of the sync separation circuits routed back to their respective buffers. However, in the preferred embodiment, to reduce costs, control logic 154 provides a sync signal to these buffers for DC restoration. The sync signal provided by control logic 154 is preferably the sync signal provided by sync generator and PLL circuit 162. Buffers 151A, 151B, and 151E-151H are preferably used as inputs from devices, such as cameras, which can be synchronized to an external source. Buffers 151C and 151D are preferably used as inputs from devices, such as VCRs, which typically cannot be synchronized to an external source. Therefore, for devices which can be synchronized, DC restoration is performed based upon a master (Genlock) sync signal and, for devices which cannot be synchronized, DC restoration is performed based upon the sync signal from that device.
One output of sync generator and PLL circuit 162 is connected to an input of control logic 154. This allows control logic 154 to determine the start of a video frame or the start of a line so that video switching occurs at the proper place in a picture. Also, some codecs require information as to the vertical interval within which switching is to occur and control logic 154 uses the signal from sync circuit 162 to provide this information as well. The output of circuit 162 is connected to the input of a distribution amplifier 164 which provides several outputs G1-G4, which are black burst generator lock outputs. These outputs are used to synchronize cameras 150 so that the pictures from all cameras 150 are in sync.
Video overlay circuit 155 is used to provide special video effects such as picture within picture, and superimposed graphics and icons. Video overlay circuit 155 may be part of control node 15, part of controller 10, or an independent device.
The auxiliary input is used to provide graphical user interface (GUI) information such as video icons, control "buttons" on the monitor display, control borders and pointers, etc. In the preferred embodiment, this information is generated by controller 10. Methods of generating GUI information are well known to those of ordinary skill in the art.
Figure 9 is a schematic block diagram of an audio unit control node 14. Control node 14 selectively routes audio signals from various sources to various destinations. In the preferred embodiment, by way of example, audio inputs are from an auxiliary input, left and right channel inputs from VCR 20, microphones 174A-174D, a telephone connection, and the audio output of codec 16. Destinations for audio signals are, again by way of example, the record input of VCR 20, a telephone connection, and the audio input of codec 16. Any input audio signal may be routed to any desired destination and, likewise, any destination may receive any selected audio input signal. It will be appreciated that, with respect to the telephone line (TELCO) connection, additional circuitry, which is not shown, will be required to comply with FCC regulations regarding connection of devices to telephone lines and also to separate the combined input/output signal on the telephone line into input signals and output signals. Methods and devices for interfacing with the telephone line to accomplish this are well known to those of ordinary skill in the art.
All input and all output signals are buffered, either by a plurality of buffers/amplifiers designated generally as 173 or a mixing circuit 172. The auxiliary input, the TELCO input, and the inputs from microphones 174A-174D are buffered by buffers/amplifiers 173A-173C, respectively. Likewise, the input from codec 16 is buffered by buffer/amplifier 173E. The inputs from VCR 20 are buffered by mixer 172A. The auxiliary input, the VCR 20 inputs, the TELCO input, the microphones 174A-174D inputs, and the codec 16 audio output are each passed through a muting circuit 170A-170E, respectively, and also through a gain control circuit 171A-171H, respectively. The auxiliary input, VCR input, and TELCO input are then provided to a plurality of mixers designated generally as 172C. Mixers 172C contain separate mixers for the output to VCR 20, the output to the TELCO, and the output to the audio input of codec 16. However, in the preferred embodiment, the inputs from microphones 174 are routed to a digital signal processing echo canceller 176. The output of echo canceller 176 is then routed to the mixers 172C. The outputs of three of the mixers of 172C are routed through gain control circuits 171I-171K and buffers/amplifiers 173E before being provided to VCR 20, the TELCO connection, and the audio input of codec 16. The audio output from codec 16 is routed through a gain control circuit 171H, a mute control circuit 170E, and then to the mixers 172C. The output of the fourth mixer of mixers 172C is routed to the received input of echo canceller 176. The received output of echo canceller 176 is routed through mute circuit 170F, gain control circuit 171L, and amplifier 173D, before being routed to speaker 175.
In the preferred embodiment, a mute circuit 170 comprises, as shown by mute circuit 170A, an analog switch. The mute circuits 170 are controlled by control logic 177. Likewise, in the preferred embodiment, gain control circuits 171, such as gain control 171A, are digitally controlled gain circuits, and are controlled by control logic 177.
In the preferred embodiment, the user can use mouse 12 to pull down a menu and select a particular input or output device, and then select the gain or muting desired for that particular device. As previously mentioned, the signals from mouse 12 are provided by converter 11B to controller 10. Controller 10 interprets the mouse signals to determine the action requested by the user and, in this case, sends appropriate gain and mute signals to converter 11D. Converter 11D extracts this information from network 23 and sends the appropriate control signals to control logic 177 which, in turn, supplies the appropriate signals to the gain circuits 171 and the mute circuits 170.
As is well known in the art, some form of echo suppression or cancellation is generally desired and, in the preferred embodiment, echo canceller 176 is an echo cancellation card manufactured by Gentner Communications Corporation, Salt Lake City, Utah. Echoes are typically caused by feedback between a speaker 175 and microphones 174 in a room, and are made more noticeable and distracting by the time delay caused by codec 16 and the additional delay which occurs when the signal is transmitted via satellite.

Camera Focusing
The present invention allows the selection of the camera focus to be controlled by the position of the camera. This feature establishes a database of the room layout and, when the user clicks and/or zooms in on a region, the database is consulted to determine the focus settings and the database focus setting is automatically applied to the camera. If the selected object is slightly out of focus the user will then adjust the focus setting manually. When the user manually adjusts the focus setting the region of the object and/or the appropriate focus setting are added to the database.

Of course, it is quite likely that a user will not position a pointer in exactly the same place on the selected object or adjust the zoom to precisely the same degree every time. Therefore, the pan position, tilt position, and field of view angle may vary slightly from time to time, even though the user is designating the same object. In order to prevent the database from unnecessarily expanding and to reduce processing time in searching the database, the present invention uses regions, rather than pixels, to determine if the user has selected the same target. The database therefore consists of a tree of regions. A region is defined as a viewing area seen by a camera and is identified by a polar coordinate system which specifies a pan position, a tilt position, and a camera field of view angle.

Figures 10A-10C are illustrations of the relationship between regions. Two regions are considered to match, or be the same region, if the intersection of the regions contains a certain percentage of each region, as shown in Fig. 10A. In the preferred embodiment, this percentage is programmable and the default setting is 80%. If a selected region does not match a prerecorded region (Fig. 10B) then the focus setting for that new region is obtained from its closest parent region. A parent region is a region which completely encompasses another region, as shown in Fig. 10B. A parent region may be encompassed within another, larger region, and therefore one parent region may be the child of another parent region, as shown in Fig. 10C. At the limit, in the preferred embodiment there is a master parent region, which is a parent to all regions, and is the default focus setting.

There is no fixed limit on the number of regions that may be stored in the database. However, in the preferred embodiment, a programmable limit on the number of regions is used and regions are discarded on a least recently used basis when necessary to accommodate the storage of settings for a new region. The present invention therefore allows the videoconferencing system (controller 10) to learn and remember the focus settings for the room and different objects within the room, and to dynamically adapt to changing room configuration and user preferences.

Turn now to Figures 11A and 11B, which are a flow chart of the camera focusing process of the present invention. Figure 11A is entered whenever there is a change in the pan, tilt, zoom or focus settings of the camera. In step 201 controller 10 determines the polar region based upon the pan position, the tilt position, and the field of view angle (zoom setting). In decision 202 a determination is made as to whether the polar region is in the database. If so then in step 203 the focus setting is obtained from the matching polar region in the database and then step 205 is executed.
If the polar region is not in the database then, in step 204, the focus setting is obtained from a parent region in the database and then step 205 is executed. It will be appreciated at this point that if there is a matching polar region then the focus setting will be extremely close to the desired focus setting. If there is not a matching polar region then, by the use of parent regions, a focus setting is obtained which may be adequate or which will allow the user to easily fine tune the focus setting. In step 205 controller 10 sends signals to converter 11G and control node 17 to adjust the focus of the selected camera. Also, the start time for that focus setting is recorded. This start time is used in step 215 below.

Decision 206 determines whether a new region has been selected, such as by point-and-click, draw-and-release, manual controls, etc. If so then a return is made to step 201. If not then decision 207 tests whether the user has adjusted the focus since it was set in step 205. If not then a return is made to decision 206. If the user has adjusted the focus then, in step 210, controller 10 sends signals which cause the focus to be adjusted according to the user's instructions and records the focus setting start time. In decision 211 controller 10 determines whether the current polar region is in the database. If so then controller 10 adjusts the focus setting in the database to correspond to the focus setting actually selected by the user and then returns to decision 206. By this process the focus for a particular polar region is made to conform to the user's particular desires.

If the polar region is not in the database then decision 213 tests whether the database is full. If not then controller 10 adds the new polar region and the focus settings to the database and returns to decision 206. However, if the database is full then, in step 215, controller 10 searches the database for the least important region and discards that region. In the preferred embodiment, the least recently used region is deemed to be the least important region and is discarded. Of course, this is a design preference and other criteria could be used to determine which region is to be discarded, such as: the least frequently used regions, regions which encompass a certain percentage of a parent region, a region which matches another region, etc. After the least important region is discarded, controller 10 adds the new region and focus setting to the database in step 214. Therefore, by the above process, the camera is automatically focused on the target selected by the user and, if the selected focus setting is unsatisfactory to the user and the user adjusts the focus setting, then the user's selected focus setting is stored and is used the next time that the user selects that region.
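The region matching, parent lookup, and least-recently-used eviction can be sketched as follows, under the simplifying assumption that a region's footprint is a square whose side equals its field of view angle; the class layout and region limit are illustrative only.

import time

MATCH_FRACTION = 0.80     # default match percentage from the text
MAX_REGIONS = 256         # the limit is programmable; this value is assumed

class Region:
    """A polar region: pan and tilt positions plus a field of view angle."""
    def __init__(self, pan, tilt, fov, focus=None):
        self.pan, self.tilt, self.fov, self.focus = pan, tilt, fov, focus
        self.last_used = time.time()
    def bounds(self):
        half = self.fov / 2.0
        return (self.pan - half, self.pan + half, self.tilt - half, self.tilt + half)

def intersection_area(a, b):
    ax0, ax1, ay0, ay1 = a.bounds()
    bx0, bx1, by0, by1 = b.bounds()
    w = min(ax1, bx1) - max(ax0, bx0)
    h = min(ay1, by1) - max(ay0, by0)
    return max(w, 0.0) * max(h, 0.0)

def matches(a, b):
    """Two regions match if the intersection covers MATCH_FRACTION of each."""
    inter = intersection_area(a, b)
    return (inter >= MATCH_FRACTION * a.fov ** 2 and
            inter >= MATCH_FRACTION * b.fov ** 2)

def lookup_focus(database, target, default_focus):
    """Return the focus of a matching region, else the closest (smallest)
    parent's, else the master parent's default setting."""
    for region in database:
        if matches(region, target):
            region.last_used = time.time()
            return region.focus
    parents = [r for r in database
               if intersection_area(r, target) >= target.fov ** 2]  # encompasses it
    if parents:
        return min(parents, key=lambda r: r.fov).focus
    return default_focus

def store_focus(database, region):
    """Add a region, discarding the least recently used one if full."""
    if len(database) >= MAX_REGIONS:
        database.remove(min(database, key=lambda r: r.last_used))
    database.append(region)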
Camera Construction

Figure 12A is an illustration of the preferred embodiment of a camera 150 of the present invention. Camera 150 has a camera body 235, a focusing lens system 230, a zoom/field of view lens system 231, a panning system 232, a tilt system 233, and a camera base 234. The design of focusing systems, zoom systems, panning systems, and tilt systems, and of cameras themselves, is well known in the art. In the preferred embodiment, rather than systems 230-233 operating open loop with controller 10, the systems provide feedback to controller 10 so that controller 10 can evaluate the response of the system to the instruction sent.
Figure 12B is an illustration of the feedback system associated with the camera controls: systems 230-233, and control node 17. A feedback unit, which is part of systems 230-233, comprises a drive motor 240, a drive shaft 241, a position sensing means 242, and a drive train 243. Position sensing means 242 may be a variable resistor, a potentiometer, a digital shaft position encoder, etc. Drive train 243 drives the appropriate focusing, zooming, panning or tilting function. Systems 230-233 are connected to camera unit control node 17. Control node 17 comprises control logic 250, a motor power supply 251, and a position-to-digital converter 252.

Assume that the user indicates that a selected camera should pan to the left. Controller 10 will send the appropriate instructions to converter 11G which, in turn, will transfer the instructions to control logic 250 of control node 17. Control logic 250 will, in turn, cause motor power supply 251 to apply the appropriate voltage to motor 240 to cause motor 240 to turn in the direction which, via drive shaft 241 and drive train 243, causes camera 150 to pan to the left. As drive shaft 241 rotates it changes the resistance of the position sensing means 242. The position-to-digital converter 252 converts the change in resistance to digital signals and provides these digital signals to control logic 250. In one embodiment, control logic 250 may close the loop and control motor power supply 251 so as to achieve the pan position specified by controller 10. In another embodiment control logic 250 sends the current pan position back to controller 10 and controller 10 determines whether the camera 150 has reached the desired position. Depending upon the particular motor used, control of motor 240 may be effected by the voltage, the pulse width, and/or the polarity of the voltage provided by motor power supply 251, which is controlled by control logic 250.

Position-to-digital converter 252 may directly measure the resistance of a potentiometer in position sensing means 242, may apply a voltage across a potentiometer in position sensing means 242 and measure the output voltage from the potentiometer, or use other means, such as digital shaft position encoding techniques. The means of sensing the position is not critical but should be accurate enough to provide the degree of control necessary to satisfy the user. In the preferred embodiment, a pan position resolution of 0.1 degrees, a tilt position resolution of 0.1 degrees, and a field of view resolution of 0.03 degrees are used. The position sensing mechanism 242 may be a factory installed part of a system 230-233 or may be a retrofit.

In the preferred embodiment, a camera 150 is a Hitachi CCD color camera model KB-C550, manufactured by Hitachi Denshi America, Woodbury, New York, and the lens is a Rainbow Automatic Iris electrically driven zoom lens model H6X8MEA-II, manufactured by International Space Optics, Huntington Beach, California. For clarity of illustration, Figure 1 illustrates only a single camera unit control node 17. However, in the preferred embodiment, there is a separate camera unit control node 17 and a separate converter 11G associated with each camera so that a camera 150 may be attached or removed from the system by connecting and disconnecting a minimum number of wires and cables. Although Figure 12B illustrates a separate motor power supply 251, position-to-digital converter 252, and control logic 250 for each system 230-233, the present invention is not so limited.
If the motors 240 for the different systems 230-233 are of a similar type then a single motor power supply 251 may be used to control all the motors. Further, the changing of a setting, such as pan, tilt, focus and zoom, occurs at a relatively slow rate compared with other system operations. Therefore, it is possible to multiplex the outputs of several position sensing means 242 into a single position-to-digital converter 252, thereby reducing costs. Control logic 250 selects the appropriate position sensing means 242 in accordance with the motor 240 of the system 230 that is being driven and needs to be monitored. In this manner, a single control logic circuit 250, motor power supply 251, and position-to-digital converter 252, combined with a multiplexer (not shown), may be employed to service two or more systems 230-233.
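The first (closed-loop) embodiment reduces to a simple control loop; the sketch below assumes callback-style access to the sensor and motor, and uses the 0.1-degree pan/tilt resolution mentioned above as the stopping tolerance.

def drive_to_position(target_deg, read_position, set_motor, resolution=0.1):
    """Power the motor until the digitized position reading reaches the
    target, then stop. read_position and set_motor stand in for the
    position-to-digital converter 252 and motor power supply 251."""
    error = target_deg - read_position()
    while abs(error) > resolution:
        set_motor(direction=1 if error > 0 else -1)  # drive toward the target
        error = target_deg - read_position()
    set_motor(direction=0)   # remove drive; the target position is reached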
Multiple Monitor Systems

Figure 13 is an illustration of a two-monitor videoconferencing system of the present invention. In the illustration, there are two monitors: monitor 21A, which depicts the scene seen by the local camera, and monitor 21B, which depicts the scene seen by the remote camera. The local camera is showing a desk 300 with two persons 301 and 302, one of which is typically the user. Monitor 21B shows the remote scene, which has a person 304 sitting at a desk or table 303. Monitor 21A also shows a control bar 270. It will be noted that person 304 is not centered in the display on monitor 21B but that the user wishes person 304 to be centered.

The user will use mouse 12 to move cursor 280 to control bar 270 and pull down a camera selection menu 271. In one embodiment the menu will pull down by simply moving the cursor over the appropriate position on the control bar and, in another embodiment, the menu will be pulled down if the user positions the pointer over the appropriate position on the control bar and depresses or clicks a button 12A, 12B on mouse 12. Methods for pulling down menus are well known in the personal computer field. Camera menu 271 lists the available cameras, such as a local camera, a remote camera, and a graphics camera. In this case the user wishes to select the remote camera so the user will click on the appropriate spot 272 of menu 271 to select the remote camera. This will cause a second menu 273 to pull down listing the functions that can be performed with that camera, such as pan left/right, tilt up/down, zoom in/out, and focus.

In this case the user wishes to move person 304 to the center of monitor 21B and decides to first pan the camera to center person 304. The user will therefore select the panning function 274. This will cause a pan control icon 275 to appear on monitor 21B. Icon 275 shows arrows to allow the user to specify movement of the camera to the right 276 or to the left 277. The user will therefore position pointer 280 over the appropriate arrow and click and hold a mouse button 12A or 12B until the desired position of person 304 has been achieved. At that point the user can go back to menu 273 to select tilt and adjust the tilt position as desired, as well as the zoom and focus. Alternatively, the user could simply use the point-and-click technique described above, that is, place pointer 280 in the middle of person 304 and click thereon, thereby causing controller 10 to automatically position person 304 in the center of monitor 21B. Also, the user could use the draw-and-release technique described above to cause person 304 to be centered in monitor 21B.
Although the control bar 270 and menus 271 and 273 are shown on monitor 21A and icon 275 is shown on monitor 21B, it will be appreciated that this is merely a design choice and that the control bar, menus, and icons may be displayed on either monitor and, if desired, can be moved, using control bar 270, from one monitor to the other. Mouse 12 is preferably used to move pointer 280 between the displays of monitors 21A and 21B. The movement of a cursor or pointer between screens is well known in the personal computer field.
In the preferred embodiment, controller 10 also supports operation with picture-within-picture, split-screen, and four-quadrant picture operation. In these cases controller 10 controls, and therefore knows, the switching point between one picture and the next and therefore is able to determine whether the pointer is over a scene controlled by a first camera, a second camera, or even a remote camera. Monitor 21B illustrates a picture 281 within the broader picture illustrated. In this illustration, picture 281 is a view of a graph 282. The user could therefore position pointer 280 over picture 281 and controller 10 would know that the subsequent user actions were directed to picture 281 and not directed to the larger picture depicting person 304. If picture 281 were being generated by a remote camera then controller 10 would send network standard signals corresponding to the desired action to the remote controller, which would cause the remote camera to take the desired action.
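Because controller 10 programs the switching points, determining which picture the pointer is over amounts to a rectangle hit test. The sketch below models each picture as a rectangle paired with its source; the coordinates and source names are invented for illustration.

    # Sketch: resolve the pointer to the picture (and hence the camera) beneath
    # it. Inner pictures are listed first so they win over the larger picture.

    def picture_at(px, py, regions):
        """regions: list of (x, y, w, h, source) rectangles in screen order."""
        for x, y, w, h, source in regions:
            if x <= px < x + w and y <= py < y + h:
                return source
        return None

    regions = [(400, 40, 200, 150, "picture 281 (remote graphics)"),
               (0, 0, 640, 480, "main remote picture")]
    print(picture_at(450, 100, regions))  # -> picture 281 (remote graphics)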
The source of the picture 281 may be any camera which is selectable. The video unit control node 15 is programmed by controller 10 to dynamically connect the appropriate video signals between the appropriate devices so that picture-within-picture and other types of pictures may be obtained. Methods for achieving various multiple picture presentations are well known in the television broadcasting field.
Other embodiments of the present invention will become apparent to those of skill in the art after a reading of the detailed description above and an inspection of the accompanying drawing figures. Therefore, the scope of the present invention is to be limited only by the claims below.

Claims

We claim:
1. A videoconferencing system, comprising: a network for carrying network standard signals; a user input device responsive to input by a user for providing user control signals, said user control signals being included in a predetermined set of user control signals; a first converter, connected to said user input device and to said network, for receiving said user control signals, for converting said user control signals into network standard user control signals, and for placing said network standard user control signals onto said network; a camera assembly responsive to camera control signals for providing a picture, said camera control signals being included in a predetermined set of camera control signals; a second converter, connected to said network and to said camera assembly, for receiving network standard camera control signals, for converting said network standard camera control signals into said camera control signals, and for providing said camera control signals to said camera assembly; a monitor, functionally connected to said camera assembly, for displaying said picture; and a controller, connected to said network, for receiving said network standard user control signals, for determining a camera assembly action specified by said network standard user control signals, for determining said network standard camera control signals necessary to implement said camera assembly action, and for placing said network standard camera control signals onto said network.
2. The videoconferencing system of Claim 1 wherein: said monitor is responsive to monitor control signals for displaying said picture, said monitor control signals being included in a predetermined set of monitor control signals; said controller is further responsive to said network standard user control signals for determining network standard monitor control signals necessary to implement an action specified by said user using said user input device, and placing said network standard monitor control signals onto said network; and said videoconferencing system further comprises a third converter, connected to said network and to said monitor, for receiving said network standard monitor control signals from said network, converting said network standard monitor control signals into said monitor control signals and providing said monitor control signals to said monitor.
3. The videoconferencing system of Claim 2 wherein: said third converter comprises means, functionally connected to said monitor, for determining a predetermined status condition of said monitor, and said third converter places onto said network a network standard monitor status signal indicating said predetermined status condition; and said controller receives said network standard monitor status signal and inspects said network standard monitor status signal to determine whether said monitor has implemented a preceding said monitor control signal.
4. The videoconferencing system of Claim 3 wherein said means for determining comprises means for detecting radiation emitted by said monitor.
5. The videoconferencing system of Claim 1 wherein said user input device comprises a mouse.
6. The videoconferencing system of Claim 1 wherein said user input device comprises a joystick.
7. The videoconferencing system of Claim 1 wherein said user input device comprises a control panel.
8. The videoconferencing system of Claim 1 and further comprising: a second monitor for displaying a remote picture; and a communications device connected to a telephone line, for exchanging video signals with a remote videoconferencing system over said telephone line, functionally connected to said camera assembly for providing said picture from said camera assembly as said video signals to said remote videoconferencing system, and functionally connected to said second monitor for receiving said video signals from said remote videoconferencing system and providing said video signals to said second monitor as said remote picture.
9. The videoconferencing system of Claim 8 wherein said communications device comprises a coder-decoder.
10. The videoconferencing system of Claim 8 and further comprising: a video control unit, functionally connected to said camera assembly, said communications device, said monitor, and said second monitor, and responsive to network standard video control signals for selectively routing said picture from said camera to said communications device, said monitor, and said second monitor, and responsive to said network standard video control signals for selectively routing said remote picture from said communications device to said monitor and said second monitor; and wherein said controller is responsive to said network standard user control signals for determining said network standard video control signals necessary to selectively route said picture from said camera and to selectively route said remote picture from said communications device as specified by said user using said user input device.
11. The videoconferencing system of Claim 10 wherein said video control unit comprises: a video switching unit responsive to video control signals, said video control signals being included in a predetermined set of video control signals, for selectively routing said picture and said remote picture; and a third converter, connected to said network and to said video switching unit, for receiving said network standard video control signals, for converting said network standard video control signals to said video control signals, and providing said video control signals to said video switching unit.
12. The videoconferencing system of Claim 1 wherein: said first converter comprises a microprocessor and a memory; and said controller contains a program, said program containing instructions as to converting said user control signals to said network standard user control signals, and said controller loads said program into said memory of said first converter using said network.
13. The videoconferencing system of Claim 1 wherein: said second converter comprises a microprocessor and a memory; and said controller contains a program, said program containing instructions as to converting said network standard camera control signals to said camera control signals, and said controller loads said program into said memory of said second converter using said network.
14. The videoconferencing system of Claim 1 wherein: said second converter comprises a microprocessor and a memory; and said controller contains a plurality of programs, each program of said plurality of programs containing instructions as to converting said network standard camera control signals into said camera control signals for a specific type of said camera assembly, and said controller is responsive to said network standard user control signals for loading into said memory a user-selected program of said plurality of programs.
15. The videoconferencing system of Claim 1 and further comprising: a peripheral device responsive to peripheral device control signals for performing predetermined functions; and a peripheral device converter, connected to said peripheral device and to said network, for receiving network standard peripheral device control signals, for converting said network standard peripheral device control signals into said peripheral device control signals, and for providing said peripheral device control signals to said peripheral device to cause said peripheral device to perform said predetermined functions designated by said peripheral device control signals.
16. The videoconferencing system of Claim 15 wherein: said peripheral device converter comprises a microprocessor and a memory; and said controller contains a plurality of programs, each program of said plurality of programs containing instructions as to converting said network standard peripheral device control signals into said peripheral device control signals for a specific type of said peripheral device, and said controller is responsive to said network standard user control signals for loading into said memory a user-selected program of said plurality of programs.
17. The videoconferencing system of Claim 16 wherein said peripheral device is a camera assembly.
18. The videoconferencing system of Claim 16 wherein said peripheral device is a video cassette recorder.
19. The videoconferencing system of Claim 16 wherein said peripheral device is at least one of a modulator-demodulator and a coder-decoder.
20. The videoconferencing system of Claim 16 wherein said peripheral device is a monitor.
21. A videoconferencing system, comprising: a first camera assembly responsive to camera control signals for providing video signals representing a first picture; a user input device responsive to input by a user for providing user control signals; a video control unit for combining said video signals from said first camera and user-control option video signals to provide combined video signals; a controller for generating said user-control option video signals, said user-control option video signals representing a camera control menu and a pointer, said controller being functionally connected to said user input device and responsive to said user control signals for positioning said pointer at a user-designated point on said camera control menu, and being responsive to said user control signals and said user-designated point for determining a first camera action desired by said user, and generating first camera control signals to cause said first camera action, said controller being functionally connected to said first camera; and a monitor, functionally connected to said video control unit, and responsive to said combined video signals for displaying said first picture, said pointer, and said camera control menu.
22. The videoconferencing system of Claim 21 and further comprising: a second camera assembly responsive to second camera control signals for providing video signals representing a second picture; wherein said video control unit is responsive to commands from said controller for providing said combined video signals from at least one of said video signals from said first camera, said video signals from said second camera, and said user-control option video signals; wherein said controller causes said user-control option video signals to represent a camera selection menu, said camera selection menu comprising a first camera indicator and a second camera indicator, and said controller is further responsive to said user control signals for positioning said pointer on a user-designated camera indicator in said camera selection menu, and is further responsive to said user control signals and said user-designated camera indicator for sending said commands to said video control unit; whereby said monitor displays a user-selected one of said first picture and said second picture.
23. The videoconferencing system of Claim 22 wherein said controller is further responsive to said user control signals for determining a second camera action desired by said user and for generating second camera control signals to cause said second camera action, said controller being functionally connected to said second camera.
24. A method for adjusting a camera, comprising the steps of: providing a picture; providing a pointer at a first location within said picture; monitoring an output of a control device for an indication by a user to move said pointer; moving said pointer to a second location in response to said indication; monitoring said output of said control device for an indication by said user that said pointer is designating a desired object; and panning said camera to position said desired object in the center of said picture.
25. The method of Claim 24 wherein said step of panning comprises: determining an amount of pan required to position said desired object in said center of said picture; and panning said camera by said amount of pan.
26. The method of Claim 25 and, after said step of determining said amount of pan, further comprising: determining whether said amount of pan exceeds a predetermined value; and if said amount of pan exceeds said predetermined value then performing said step of panning.
27. The method of Claim 25 and, after said step of determining said amount of pan, further comprising: determining whether said amount of pan exceeds a predetermined value; and if said amount of pan exceeds said predetermined value then zooming out said camera as said step of panning said camera is being started.
28. The method of Claim 27 and further comprising zooming in said camera as said step of panning said camera is being completed.
29. The method of Claim 24 and further comprising tilting said camera to position said desired object in said center of said picture.
30. The method of Claim 29 wherein: said step of panning comprises determining an amount of pan required to position said desired object in said center of said picture, and panning said camera by said amount of pan; and said step of tilting comprises determining an amount of tilt required to position said desired object in said center of said picture, and tilting said camera by said amount of tilt.
31. The method of Claim 30 and, prior to said step of panning said camera by said amount of pan and said step of tilting said camera by said amount of tilt, further comprising: adjusting at least one of a rate of pan and a rate of tilt so that said step of panning said camera and said step of tilting said camera are completed at approximately a single point in time.
32. The method of Claim 31 wherein said step of adjusting at least one comprises: determining an amount of time required to perform said step of panning at a maximum rate of pan; determining an amount of time required to perform said step of tilting at a maximum rate of tilt; if said amount of time required to perform said step of panning is greater than said amount of time required to perform said step of tilting then decreasing said rate of tilt; and if said amount of time required to perform said step of panning is less than said amount of time required to perform said step of tilting then decreasing said rate of pan.
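The rate adjustment recited in Claims 31 and 32 can be expressed compactly: the axis that needs more time at its maximum rate sets the pace, and the other axis is slowed to match. The maximum rates in the sketch below are illustrative assumptions, not values taken from the claims.

    # Sketch of Claims 31-32: slow the faster axis so pan and tilt complete at
    # approximately the same time. The maximum rates are assumed example values.

    def matched_rates(pan_deg, tilt_deg, max_pan_rate=20.0, max_tilt_rate=15.0):
        """Return (pan_rate, tilt_rate) in degrees per second."""
        t_pan = abs(pan_deg) / max_pan_rate     # time to pan at the maximum rate
        t_tilt = abs(tilt_deg) / max_tilt_rate  # time to tilt at the maximum rate
        t = max(t_pan, t_tilt)                  # the slower movement sets the pace
        if t == 0:
            return 0.0, 0.0
        return abs(pan_deg) / t, abs(tilt_deg) / t

    # A 30-degree pan and a 3-degree tilt both finish in 1.5 seconds.
    print(matched_rates(30.0, 3.0))  # -> (20.0, 2.0)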
33. A method for adjusting a camera, comprising the steps of: providing a picture; providing a pointer at a first location within said picture; monitoring an output of a control device for an indication by a user to move said pointer; moving said pointer to a second location in response to said indication; monitoring said output of said control device for an indication by said user that said pointer is designating a desired object; and tilting said camera to position said desired object in the center of said picture.
34. The method of Claim 33 wherein said step of tilting comprises: determining an amount of tilt required to position said desired object in said center of said picture; and tilting said camera by said amount of tilt.
35. The method of Claim 34 and, after said step of determining said amount of tilt, further comprising: determining whether said amount of tilt exceeds a predetermined value; and if said amount of tilt exceeds said predetermined value then performing said step of tilting.
36. The method of Claim 34 and, after said step of determining said amount of tilt, further comprising: determining whether said amount of tilt exceeds a predetermined value; and if said amount of tilt exceeds said predetermined value then zooming out said camera as said step of tilting said camera is being started.
37. The method of Claim 36 and further comprising zooming in said camera as said step of tilting said camera is being completed.
38. A method for adjusting a camera, comprising the steps of: determining an amount of pan required to move a camera from a first position to a second position; determining whether said amount of pan exceeds a predetermined value; and if said amount of pan exceeds said predetermined value then performing said step of panning.
39. The method of Claim 38 and, after said step of determining said amount of pan, further comprising: determining whether said amount of pan exceeds a second predetermined value; and if said amount of pan exceeds said second predetermined value then zooming out said camera as said step of panning said camera is being started.
40. The method of Claim 39 and further comprising zooming in said camera as said step of panning said camera is being completed.
41. The method of Claim 38 and further comprising tilting said camera to center said desired object in said picture.
42. The method of Claim 41 wherein: said step of panning comprises determining an amount of pan required to center said desired object in said picture, and panning said camera by said amount of pan; and said step of tilting comprises determining an amount of tilt required to center said desired object in said picture, and tilting said camera by said amount of tilt.
43. The method of Claim 42 and, prior to said step of panning said camera and said step of tilting said camera, further comprising: adjusting at least one of a rate of pan and a rate of tilt so that said step of panning said camera and said step of tilting said camera are completed at approximately a single point in time.
44. The method of Claim 43 wherein said step of adjusting at least one comprises: determining an amount of time required to perform said step of panning at a maximum rate of pan; determining an amount of time required to perform said step of tilting at a maximum rate of tilt; if said amount of time required to perform said step of panning is greater than said amount of time required to perform said step of tilting then decreasing said rate of tilt; and if said amount of time required to perform said step of panning is less than said amount of time required to perform said step of tilting then decreasing said rate of pan.
45. A method for adjusting a camera, comprising the steps of: determining an amount of tilt required to move a camera from a first position to a second position; determining whether said amount of tilt exceeds a predetermined value; and if said amount of tilt exceeds said predetermined value then performing said step of tilting.
46. The method of Claim 45 and, after said step of determining said amount of tilt, further comprising: determining whether said amount of tilt exceeds a predetermined value; and if said amount of tilt exceeds said predetermined value then zooming out said camera as said step of tilting said camera is being started.
47. The method of Claim 46 and further comprising zooming in said camera as said step of tilting said camera is being completed.
48. The method of Claim 45 and further comprising panning said camera to center said desired object in said picture.
49. The method of Claim 48 wherein: said step of panning comprises determining an amount of pan required to center said desired object in said picture, and panning said camera by said amount of pan; and said step of tilting comprises determining an amount of tilt required to center said desired object in said picture, and tilting said camera by said amount of tilt.
50. The method of Claim 49 and, prior to said step of panning said camera and said step of tilting said camera, further comprising: adjusting at least one of a rate of pan and a rate of tilt so that said step of panning said camera and said step of tilting said camera are completed at approximately a single point in time.
51. The method of Claim 50 wherein said step of adjusting at least one comprises: determining an amount of time required to perform said step of panning at a maximum rate of pan; determining an amount of time required to perform said step of tilting at a maximum rate of tilt; if said amount of time required to perform said step of panning is greater than said amount of time required to perform said step of tilting then decreasing said rate of tilt; and if said amount of time required to perform said step of panning is less than said amount of time required to perform said step of tilting then decreasing said rate of pan.
52. A method for adjusting a camera, comprising the steps of: providing a picture; providing a pointer at a first location within said picture; monitoring an output of a control device for an indication by a user to draw an area of interest; drawing said area of interest on said picture in response to said indication; monitoring said output of said control device for an indication by said user that said area of interest is completed; moving said camera so that said area of interest is centered within said picture; and zooming said camera so that said area of interest fills at least a predetermined portion of said picture.
53. The method of Claim 52 wherein said step of moving comprises: determining the center of said picture; determining the center of said area of interest; and panning said camera to position said center of said area of interest over said center of said picture.
54. The method of Claim 53 wherein said step of panning said camera comprises: determining an amount of pan required to position said center of said area of interest over said center of said picture; and panning said camera by said amount of pan.
55. The method of Claim 54 and, after said step of determining said amount of pan, further comprising: determining whether said amount of pan exceeds a predetermined value; and if said amount of pan exceeds said predetermined value then performing said step of panning.
56. The method of Claim 54 and, after said step of determining said amount of pan, further comprising: determining whether said amount of pan exceeds a predetermined value; and if said amount of pan exceeds said predetermined value then zooming out said camera as said step of panning said camera is being started.
57. The method of Claim 56 and further comprising zooming in said camera as said step of panning said camera is being completed.
58. The method of Claim 52 wherein said step of moving comprises: determining the center of said picture; determining the center of said area of interest; and tilting said camera to position said center of said area of interest over said center of said picture.
59. The method of Claim 58 wherein said step of tilting said camera comprises: determining an amount of tilt required to position said center of said area of interest over said center of said picture; and tilting said camera by said amount of tilt.
60. The method of Claim 59 and, after said step of determining said amount of tilt, further comprising: determining whether said amount of tilt exceeds a predetermined value; and if said amount of tilt exceeds said predetermined value then performing said step of tilting.
61. The method of Claim 59 and, after said step of determining said amount of tilt, further comprising: determining whether said amount of tilt exceeds a predetermined value; and if said amount of tilt exceeds said predetermined value then zooming out said camera as said step of tilting said camera is being started.
62. The method of Claim 61 and further comprising zooming in said camera as said step of tilting said camera is being completed.
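The draw-and-release method of Claims 52 through 62 combines the centering computation with a zoom chosen so the drawn area fills a predetermined portion of the picture. In the sketch below the 0.8 fill fraction is an assumed value; the claims leave the predetermined portion open.

    # Sketch of Claim 52: center the drawn area of interest, then zoom so it
    # fills a predetermined portion of the picture. FILL_FRACTION is assumed.

    FILL_FRACTION = 0.8

    def frame_area(rect, width, height, hfov_deg, vfov_deg):
        """rect = (x, y, w, h) in pixels; return (pan_deg, tilt_deg, zoom)."""
        x, y, w, h = rect
        cx, cy = x + w / 2, y + h / 2                  # center of the area
        pan = (cx - width / 2) / width * hfov_deg      # pan that centers it
        tilt = (height / 2 - cy) / height * vfov_deg   # tilt that centers it
        zoom = FILL_FRACTION / max(w / width, h / height)  # >1 means zoom in
        return pan, tilt, zoom

    # A 160 x 120 area drawn at (80, 60) on a 640 x 480 display.
    print(frame_area((80, 60, 160, 120), 640, 480, 40.0, 30.0))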
63. A method for adjusting a camera, comprising the steps of: dividing a range of operation of said camera into a plurality of regions; providing a picture from said camera; determining in which region of said plurality of regions said picture occurs; determining a focus setting for said camera in response to said region in which said picture occurs; and focusing said camera to said focus setting.
64. The method of Claim 63 wherein said step of determining said focus setting comprises: searching a database containing said plurality of regions for a matching region to said region in which said picture occurs; and if said matching region is found then reading said focus setting for said matching region from said database.
65. The method of Claim 64 and further comprising: monitoring an output of a control device for an indication by a user to change said focus setting for said camera; focusing said camera in response to said indication; and changing said focus setting for said matching region in said database in response to said indication.
66. The method of Claim 64 wherein said step of determining said focus setting further comprises: if said matching region is not found then searching said database for a parent region to said region in which said picture occurs; and if said parent region is found then reading said focus setting for said parent region from said database.
67. The method of Claim 66 and further comprising: monitoring an output of a control device for an indication by a user to change said focus setting for said camera; focusing said camera in response to said indication; generating a new focus setting in response to said indication; and storing said region in which said picture occurs and said new focus setting in said database.
68. The method of Claim 66 wherein said step of determining said focus setting further comprises: if said parent region is not found then using a default value for said focus setting.
69. The method of Claim 68 and further comprising: monitoring an output of a control device for an indication by a user to change said focus setting for said camera; focusing said camera in response to said indication; generating a new focus setting in response to said indication; and storing said region in which said picture occurs and said new focus setting in said database.
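The lookup order of Claims 63 through 69, matching region first, then parent region, then a default, maps naturally onto a small keyed store. In the sketch below the database is a plain dictionary and the region keys are hypothetical; the claims do not prescribe any storage format.

    # Sketch of Claims 63-69: focus from the matching region, else its parent,
    # else a default; user corrections are written back to the database.

    DEFAULT_FOCUS = 0.5  # assumed default focus setting

    def focus_setting(region, parents, db):
        """parents maps a region to its parent; db maps regions to settings."""
        if region in db:
            return db[region]              # matching region found
        parent = parents.get(region)
        if parent in db:
            return db[parent]              # fall back to the parent region
        return DEFAULT_FOCUS               # neither found: use the default

    db = {"left-half": 0.7}
    parents = {"left-quarter": "left-half"}
    print(focus_setting("left-quarter", parents, db))  # -> 0.7 (from parent)
    db["left-quarter"] = 0.65              # store a user correction (Claim 67)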
70. A method for adjusting a camera, comprising the steps of: pointing said camera at a first position; monitoring a control device for an indication by a user to pan said camera from said first position to a second position; determining an amount of pan required to pan said camera in accordance with said indication; comparing said amount of pan to a predetermined value; if said amount of pan is less than said predetermined value then panning said camera to said second position; and if said amount of pan exceeds said predetermined value then zooming out said camera and panning said camera to said second position.
71. The method of Claim 70 and if said amount of pan exceeds said predetermined value then further comprising zooming in said camera as said step of panning is being completed.
72. A method for adjusting a camera, comprising the steps of: pointing said camera at a first position; monitoring a control device for an indication by a user to tilt said camera from said first position to a second position; determining an amount of tilt required to tilt said camera in accordance with said indication; comparing said amount of tilt to a predetermined value; if said amount of tilt is less than said predetermined value then tilting said camera to said second position; and if said amount of tilt exceeds said predetermined value then zooming out said camera and tilting said camera to said second position.
73. The method of Claim 72 and if said amount of tilt exceeds said predetermined value then further comprising zooming in said camera as said step of tilting is being completed.
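Claims 70 through 73 describe a thresholded move: small movements are executed directly, while large ones are bracketed by a zoom out and a zoom in, which keeps the subject recognizable while the camera sweeps. The threshold and zoom factors in the sketch are assumed values; the claims leave them open.

    # Sketch of Claims 70-71: pans beyond a threshold are wrapped in a zoom out
    # at the start and a zoom in at the completion. All values are assumptions.

    PAN_THRESHOLD_DEG = 10.0

    def plan_pan(amount_deg):
        """Return the ordered camera steps for a pan of the given size."""
        if abs(amount_deg) < PAN_THRESHOLD_DEG:
            return [("pan", amount_deg)]
        return [("zoom_out", 0.5),   # widen the view as the pan starts
                ("pan", amount_deg),
                ("zoom_in", 2.0)]    # restore the view as the pan completes

    print(plan_pan(4.0))   # small pan: executed directly
    print(plan_pan(25.0))  # large pan: zoom out, pan, zoom in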
74. A videoconferencing system with error detection, comprising: a camera for providing a picture, said camera including a zoom mechanism and a focus mechanism; a pan and tilt mechanism for pointing said camera to a desired position; a zoom position indicator for providing a zoom status signal; a focus position indicator for providing a focus status signal; a pan position indicator for providing a pan status signal; a tilt position indicator for providing a tilt status signal; a monitor for displaying said picture; and a controller for providing zoom and focus commands to said camera, and for providing pan and tilt commands to said pan and tilt mechanism, for monitoring said zoom status signal, said focus status signal, said pan status signal, and said tilt status signal to determine whether said commands were properly executed, for determining that an error has occurred if any of said commands were not properly executed, and for providing an error detection signal if a said error has occurred.
75. The videoconferencing system of Claim 74 wherein said controller has a memory and maintains in said memory a record of said error and an indication of the command associated with said error.
76. The videoconferencing system of Claim 75 and further comprising a modem for connecting said controller to a telephone line and wherein said controller provides a notice of said error and said indication of the command associated with said error via said modem to a remote location.
77. The videoconferencing system of Claim 76 wherein said controller automatically provides said notice of said error and said indication of the command associated with said error upon determining that said error has occurred.
78. The videoconferencing system of Claim 76 wherein said controller provides said notice of said error and said indication of the command associated with said error in response to said remote location calling into said modem.
79. The videoconferencing system of Claim 76 wherein said controller provides said notice of said error and said indication of the command associated with said error in response to said remote location calling into said modem.
80. The videoconferencing system of Claim 74 wherein said controller, upon determining that said error has occurred, identifies a command associated with said error and provides a second command, said second command specifying a function which is reverse to said function specified by said command associated with said error.
81. The videoconferencing system of Claim 80 wherein said controller monitors said status signals to determine whether said second command was properly executed.
82. The videoconferencing system of Claim 81 wherein said controller provides a first said error detection signal if said second command was properly executed and provides a second said error detection signal if said second command was not properly executed.
83. The videoconferencing system of Claim 81 wherein, if said second command was properly executed, said controller again sends said command associated with said error and monitors said status signals to determine whether said command associated with said error was properly executed.
84. The videoconferencing system of Claim 83 wherein said controller provides a first said error detection signal if said command which was sent again was properly executed and provides a second said error detection signal if said command which was sent again was not properly executed.
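The recovery behavior of Claims 80 through 84, reverse the failed command and, if the reverse succeeds, retry the original, is sketched below. Here execute() stands in for issuing a command and checking the corresponding status signal; the command names and return values are hypothetical.

    # Sketch of Claims 80-84: on a failure, send the reverse command; if the
    # reverse executes properly, send the original command again.

    REVERSE = {"pan_left": "pan_right", "pan_right": "pan_left",
               "tilt_up": "tilt_down", "tilt_down": "tilt_up",
               "zoom_in": "zoom_out", "zoom_out": "zoom_in"}

    def recover(command, execute):
        """execute(cmd) -> True if the status signals confirm proper execution."""
        if execute(command):
            return None                    # no error occurred
        if not execute(REVERSE[command]):
            return "second error signal"   # reverse command also failed
        if execute(command):               # reverse worked: send the command again
            return "first error signal"
        return "second error signal"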
85. A videoconferencing system with error detection, comprising: a camera for providing a picture; a monitor for displaying said picture; a transmitter responsive to a command signal for sending an infrared control signal to said monitor; a detector for providing a monitor status signal in response to detecting a response from said monitor to said infrared control signal; and a controller for sending said command signal to said transmitter, for monitoring said monitor status signal to determine whether said command was properly executed by said monitor, for determining that an error has occurred if said command was not properly executed by said monitor, and for providing an error detection signal if said error has occurred.
86. The videoconferencing system of Claim 85 wherein said controller has a memory and maintains in said memory a record of said error and an indication of the command associated with said error.
87. The videoconferencing system of Claim 86 and further comprising a modem for connecting said controller to a telephone line and wherein said controller provides a notice of said error and said indication of the command associated with said error via said modem to a remote location.
88. The videoconferencing system of Claim 87 wherein said controller automatically provides said notice of said error and said indication of the command associated with said error upon determining that said error has occurred.
89. The videoconferencing system of Claim 87 wherein said controller provides said notice of said error and said indication of the command associated with said error in response to said remote location calling into said modem.
90. The videoconferencing system of Claim 85 wherein said controller, upon determining that said error has occurred, identifies a command associated with said error and provides a second command, said second command specifying a function which is reverse to said function specified by said command associated with said error.
91. The videoconferencing system of Claim 90 wherein said controller monitors said status signal to determine whether said second command was properly executed.
92. The videoconferencing system of Claim 91 wherein said controller provides a first said error detection signal if said second command was properly executed and provides a second said error detection signal if said second command was not properly executed.
93. The videoconferencing system of Claim 91 wherein, if said second command was properly executed, said controller again sends said command associated with said error and monitors said status signal to determine whether said command associated with said error was properly executed.
94. The videoconferencing system of Claim 93 wherein said controller provides a first said error detection signal if said command which was sent again was properly executed and provides a second said error detection signal if said command which was sent again was not properly executed.
95. A videoconferencing system with error detection, comprising: a camera for providing a picture; a peripheral device for performing a specified function in response to a command signal; a detector for providing a peripheral device status signal in response to detecting said peripheral device performing said specified function; a monitor for displaying said picture; and a controller for providing said command to said peripheral device, for monitoring said peripheral device status signal to determine whether said command was properly executed by said peripheral device, for determining that an error has occurred if said command was not properly executed by said peripheral device, and for providing an error detection signal if said error has occurred.
96. The videoconferencing system of Claim 95 wherein said controller has a memory and maintains in said memory a record of said error and an indication of the command associated with said error.
97. The videoconferencing system of Claim 96 and further comprising a modem for connecting said controller to a telephone line and wherein said controller provides a notice of said error and said indication of the command associated with said error via said modem to a remote location.
98. The videoconferencing system of Claim 97 wherein said controller automatically provides said notice of said error and said indication of the command associated with said error upon determining that said error has occurred.
99. The videoconferencing system of Claim 97 wherein said controller provides said notice of said error and said indication of the command associated with said error in response to said remote location calling into said modem.
100. The videoconferencing system of Claim 95 wherein said controller, upon determining that said error has occurred, provides a second command to said peripheral device, said second command specifying a function which is reverse to said function specified by said command associated with said error.
101. The videoconferencing system of Claim 100 wherein said controller monitors said peripheral device status signal to determine whether said second command was properly executed.
102. The videoconferencing system of Claim 101 wherein said controller provides a first said error detection signal if said second command was properly executed and provides a second said error detection signal if said second command was not properly executed.
103. The videoconferencing system of Claim 101 wherein, if said second command was properly executed, said controller again sends said command associated with said error to said peripheral device and monitors said peripheral device status signal to determine whether said command associated with said error was properly executed.
104. The videoconferencing system of Claim 103 wherein said controller provides a first said error detection signal if said command which was sent again was properly executed and provides a second said error detection signal if said command which was sent again was not properly executed.
105. A converter for converting network standard control signals, received from a controller over a network, into device-specific control signals for controlling a device, comprising: input means, connected to said network, for receiving said network standard control signals; a memory for storing a conversion program, said conversion program containing instructions as to converting said network standard control signals into said device-specific control signals; output means, for providing said device-specific control signals to said device; and a microprocessor, connected to said input means, said memory, and said output means, for receiving said conversion program from said controller via said network and said input means, for causing said memory to store said conversion program, for executing said instructions in said conversion program to convert said network standard control signals received by said input means into device-specific control signals, and for causing said output means to provide said device-specific control signals to said device.
106. The converter of Claim 105 wherein said memory contains a basic program and said microprocessor executes said basic program to determine whether a said conversion program has been stored in said memory, and to provide a notice if a said conversion program has not been stored.
107. The converter of Claim 106 and further comprising a second output means connected to said network for placing network standard control signals on said network, and wherein said microprocessor provides said notice to said controller via said second output means.
108. The converter of Claim 106 and further comprising a visible display device, and wherein said microprocessor provides said notice by causing said visible display device to provide a predetermined visible display.
109. The converter of Claim 106 and further comprising user-operable means to cause said microprocessor to provide said notice.
110. A converter for converting device-specific control signals from a device into network standard control signals and providing said network standard control signals to a controller over a network, comprising: input means for receiving said device-specific control signals from said device; input/output means, connected to said network, for receiving conversion program signals over said network and for placing said network standard control signals onto said network; a memory for storing a conversion program, said conversion program containing instructions as to converting said device-specific control signals into said network standard control signals; and a microprocessor, connected to said input means, said input/output means, and said memory, for receiving said conversion program from said controller via said network and said input/output means, for causing said memory to store said conversion program, for executing said instructions in said conversion program to convert said device-specific control signals received by said input means into said network standard control signals, and for causing said input/output means to place said network standard control signals onto said network.
111. The converter of Claim 110 wherein said memory contains a basic program and said microprocessor executes said basic program to determine whether a said conversion program has been stored in said memory, and to provide a notice if a said conversion program has not been stored.
112. The converter of Claim 111 wherein said microprocessor provides said notice to said controller via said input/output means.
113. The converter of Claim 111 and further comprising a visible display device, and wherein said microprocessor provides said notice by causing said visible display device to provide a predetermined visible display.
114. The converter of Claim 111 and further comprising user-operable means to cause said microprocessor to provide said notice.
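A converter of Claims 105 through 114 behaves like a small download-and-dispatch loop: a basic program reports when no conversion program is stored, and otherwise the stored program translates each network standard signal. The class below is a schematic model only; the message format and method names are invented for illustration.

    # Sketch of Claims 105-114: a converter whose conversion program is loaded
    # over the network by the controller. All names here are hypothetical.

    class Converter:
        def __init__(self, send_to_network):
            self.send_to_network = send_to_network
            self.conversion = None            # no conversion program stored yet

        def load_program(self, program):
            """Store the conversion program received from the controller."""
            self.conversion = program

        def handle(self, network_standard_signal):
            """Convert one network standard signal to a device-specific one."""
            if self.conversion is None:       # basic program: provide a notice
                self.send_to_network("notice: no conversion program stored")
                return None
            return self.conversion(network_standard_signal)

    # Example: load a per-camera program that maps a pan command to a string.
    conv = Converter(print)
    conv.handle({"cmd": "pan", "dir": "left"})          # notice: nothing stored
    conv.load_program(lambda msg: "PAN " + msg["dir"].upper())
    print(conv.handle({"cmd": "pan", "dir": "left"}))   # -> PAN LEFT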
115. A videoconferencing system, comprising: a first video device responsive to first video control signals for providing first video signals representing a first picture; a second video device responsive to second video control signals for providing second video signals representing a second picture; a user input device responsive to input by a user for providing user control signals; a video control unit for combining said first video signals from said first video device, said second video signals from said second video device, and user-control option video signals to provide combined video signals representing a combined picture; a controller for causing said video control unit to provide said combined video signals, for generating said user-control option video signals, said user-control option video signals representing device control options and a pointer, said controller being functionally connected to said user input device and responsive to said user control signals for positioning said pointer at a user-designated point on said combined picture, and being responsive to said user-designated point for determining which one of said video devices is providing said video signals corresponding to said combined picture at said user-designated point, and being further responsive to said user control signals for determining a video device action desired by said user, for generating video control signals to cause said video device action, and for providing said video control signals to said one of said video devices, said controller being functionally connected to said first video device, said second video device, said user input device, and said video control unit; and a monitor, functionally connected to said video control unit, and responsive to said combined video signals for displaying said combined picture.
116. The videoconferencing system of Claim 115 wherein at least one of said first video device and said second video device is a camera.
PCT/US1994/010968 1993-10-20 1994-09-28 Adaptive videoconferencing system WO1995011566A1 (en)

Priority Applications (3)

Application Number Priority Date Filing Date Title
EP94929914A EP0724809A1 (en) 1993-10-20 1994-09-28 Adaptive videoconferencing system
JP7511844A JPH09506217A (en) 1993-10-20 1994-09-28 Adaptive video conference system
AU79210/94A AU7921094A (en) 1993-10-20 1994-09-28 Adaptive videoconferencing system

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US13964593A 1993-10-20 1993-10-20
US08/139,645 1993-10-20

Publications (1)

Publication Number Publication Date
WO1995011566A1

Family

ID=22487635

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US1994/010968 WO1995011566A1 (en) 1993-10-20 1994-09-28 Adaptive videoconferencing system

Country Status (8)

Country Link
US (7) US5515099A (en)
EP (1) EP0724809A1 (en)
JP (1) JPH09506217A (en)
CN (1) CN1135823A (en)
AU (1) AU7921094A (en)
CA (1) CA2174336A1 (en)
SG (1) SG67927A1 (en)
WO (1) WO1995011566A1 (en)

Cited By (16)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP0713331A1 (en) * 1994-11-17 1996-05-22 Canon Kabushiki Kaisha Camera control device
DE19624151A1 (en) * 1995-10-19 1997-04-24 Fujitsu Ltd Video presentation system
EP0821522A2 (en) * 1996-07-23 1998-01-28 Canon Kabushiki Kaisha Camera control apparatus and method
EP0944259A2 (en) * 1998-03-16 1999-09-22 plettac AG Control of camera movement
FR2782877A1 (en) * 1998-08-31 2000-03-03 France Telecom AUTOMATIC SOUND AND IMAGE SYSTEM
US6469737B1 (en) 1996-07-23 2002-10-22 Canon Kabushiki Kaisha Image-sensing server and control method and storage medium therefor
US6484195B1 (en) 1996-07-23 2002-11-19 Canon Kabushiki Kaisha Server, terminal and control methods for transmitting real-time images over the internet
WO2003013140A1 (en) * 2001-07-25 2003-02-13 Stevenson Neil J A camera control apparatus and method
WO2003013128A1 (en) 2001-07-27 2003-02-13 Honeywell Limited A control system for allowing an operator to proportionally control a work piece
FR2852473A1 (en) * 2003-03-13 2004-09-17 France Telecom Remote video processing network control process for use in videophonic telecommunication, involves execution of modification command on video flow by video processing network before transmitting it to terminal e.g. server
EP2148496A1 (en) * 1998-05-01 2010-01-27 Canon Kabushiki Kaisha Image communication apparatus and its control method
EP2066116A3 (en) * 2007-11-28 2010-03-24 Sony Corporation Imaging apparatus and method, information processing apparatus and method, and recording medium storing a program therefor
WO2011030097A1 (en) * 2009-09-11 2011-03-17 The Vitec Group Plc Camera system control and interface
EP1830567A4 (en) * 2004-12-21 2011-05-18 Zte Corp A method for upgrading software in the teleconference video terminal
US8879721B2 (en) 2006-03-01 2014-11-04 Sony Corporation Audio communication system
US9774788B2 (en) 2007-02-16 2017-09-26 Axis Ab Providing area zoom functionality for a camera

Families Citing this family (401)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE4290947T1 (en) * 1991-04-08 1993-04-01 Hitachi, Ltd., Tokio/Tokyo, Jp
US10361802B1 (en) 1999-02-01 2019-07-23 Blanding Hovenweep, Llc Adaptive pattern recognition based control system and method
US8352400B2 (en) 1991-12-23 2013-01-08 Hoffberg Steven M Adaptive pattern recognition based controller apparatus and method and human-factored interface therefore
US7509270B1 (en) * 1992-12-09 2009-03-24 Discovery Communications, Inc. Electronic Book having electronic commerce features
US7401286B1 (en) * 1993-12-02 2008-07-15 Discovery Communications, Inc. Electronic book electronic links
US6675386B1 (en) * 1996-09-04 2004-01-06 Discovery Communications, Inc. Apparatus for video access and control over computer network, including image correction
US8073695B1 (en) 1992-12-09 2011-12-06 Adrea, LLC Electronic book with voice emulation features
EP0920207B2 (en) 1992-12-09 2006-09-27 Sedna Patent Services, LLC Interactive terminal for television delivery system
US7298851B1 (en) * 1992-12-09 2007-11-20 Discovery Communications, Inc. Electronic book security and copyright protection system
US7849393B1 (en) 1992-12-09 2010-12-07 Discovery Communications, Inc. Electronic book connection to world watch live
US7835989B1 (en) 1992-12-09 2010-11-16 Discovery Communications, Inc. Electronic book alternative delivery systems
JP3382276B2 (en) 1993-01-07 2003-03-04 キヤノン株式会社 Electronic device and control method thereof
USRE43462E1 (en) 1993-04-21 2012-06-12 Kinya (Ken) Washino Video monitoring and conferencing system
US6466263B1 (en) * 1993-04-28 2002-10-15 Olympus Optical Co., Ltd. Electronic still camera having pointing indicator showing operation mode
US5745161A (en) * 1993-08-30 1998-04-28 Canon Kabushiki Kaisha Video conference system
US6037936A (en) * 1993-09-10 2000-03-14 Criticom Corp. Computer vision system with a graphic user interface and remote camera control
EP0644694B1 (en) * 1993-09-20 2000-04-26 Canon Kabushiki Kaisha Video System
JPH07135594A (en) * 1993-11-11 1995-05-23 Canon Inc Image pickup controller
US5574934A (en) * 1993-11-24 1996-11-12 Intel Corporation Preemptive priority-based transmission of signals using virtual channels
US5524110A (en) * 1993-11-24 1996-06-04 Intel Corporation Conferencing over multiple transports
US7865567B1 (en) 1993-12-02 2011-01-04 Discovery Patent Holdings, Llc Virtual on-demand electronic book
US8095949B1 (en) 1993-12-02 2012-01-10 Adrea, LLC Electronic book with restricted access features
US7861166B1 (en) 1993-12-02 2010-12-28 Discovery Patent Holding, Llc Resizing document pages to fit available hardware screens
US9053640B1 (en) 1993-12-02 2015-06-09 Adrea, LLC Interactive electronic book
US6476868B1 (en) * 1994-04-11 2002-11-05 Canon Kabushiki Kaisha Image pickup apparatus provided with enlargement process means for enlarging image signals output from an image pickup device
JP3797678B2 (en) * 1994-05-26 2006-07-19 富士通株式会社 Window and camera cooperative control method and apparatus
JP3658036B2 (en) * 1994-06-03 2005-06-08 キヤノン株式会社 Image input system and method for controlling image input system
US5835140A (en) * 1994-06-27 1998-11-10 Matsushita Electric Industrial Co., Ltd. Remote-control method and apparatus for rotating image device
US6707484B1 (en) * 1994-07-28 2004-03-16 Semiconductor Energy Laboratory Co., Ltd. Information processing system
US5802281A (en) 1994-09-07 1998-09-01 Rsi Systems, Inc. Peripheral audio/video communication system that interfaces with a host computer and determines format of coded audio/video signals
JP3491990B2 (en) * 1994-10-26 2004-02-03 キヤノン株式会社 Document camera device
US5767897A (en) * 1994-10-31 1998-06-16 Picturetel Corporation Video conferencing system
TW250616B (en) * 1994-11-07 1995-07-01 Discovery Communicat Inc Electronic book selection and delivery system
JP3335017B2 (en) * 1994-11-29 2002-10-15 キヤノン株式会社 Camera device control device
US5661785A (en) * 1994-12-22 1997-08-26 Lucent Technologies Inc. Flexible telecommunications line interface
US6195122B1 (en) * 1995-01-31 2001-02-27 Robert Vincent Spatial referenced photography
JP3265893B2 (en) * 1995-02-13 2002-03-18 株式会社日立製作所 Image display device
JP2981408B2 (en) * 1995-02-16 1999-11-22 日本電気システム建設株式会社 Method and apparatus for controlling high-speed introduction of a target object in a camera image
JPH08223561A (en) * 1995-02-17 1996-08-30 Nippon Denki Syst Kensetsu Kk Method and device for controlling camera by the use of monitor area diagram
US5854898A (en) 1995-02-24 1998-12-29 Apple Computer, Inc. System for automatically adding additional data stream to existing media connection between two end points upon exchange of notifying and confirmation messages therebetween
US7079177B2 (en) * 1995-02-27 2006-07-18 Canon Kabushiki Kaisha Remote control system and access control method for information input apparatus with limitation by user for image access and camemremote control
US5657246A (en) * 1995-03-07 1997-08-12 Vtel Corporation Method and apparatus for a video conference user interface
WO1996027983A1 (en) 1995-03-07 1996-09-12 Interval Research Corporation System and method for selective recording of information
JPH08256318A (en) * 1995-03-17 1996-10-01 Fujitsu Ltd Camera controller for video conference system
JP3272185B2 (en) * 1995-04-07 2002-04-08 キヤノン株式会社 Display device
JP3315555B2 (en) * 1995-04-07 2002-08-19 キヤノン株式会社 Camera control device
US6122005A (en) * 1995-04-14 2000-09-19 Canon Kabushiki Kaisha Camera control system having list of camera names updated in accordance with frequency of use and other ease of use features
US5793415A (en) * 1995-05-15 1998-08-11 Imagetel International Inc. Videoconferencing and multimedia system
US5742845A (en) 1995-06-22 1998-04-21 Datascape, Inc. System for extending present open network communication protocols to communicate with non-standard I/O devices directly coupled to an open network
US5689800A (en) * 1995-06-23 1997-11-18 Intel Corporation Video feedback for reducing data rate or increasing quality in a video processing system
US5926209A (en) * 1995-07-14 1999-07-20 Sensormatic Electronics Corporation Video camera apparatus with compression system responsive to video camera adjustment
US6731334B1 (en) * 1995-07-31 2004-05-04 Forgent Networks, Inc. Automatic voice tracking camera system and method of operation
JPH0946591A (en) * 1995-07-31 1997-02-14 Canon Inc Image processing unit and image processing system
US6008837A (en) * 1995-10-05 1999-12-28 Canon Kabushiki Kaisha Camera control apparatus and method
US5963250A (en) * 1995-10-20 1999-10-05 Parkervision, Inc. System and method for controlling the field of view of a camera
US5706049A (en) * 1995-11-30 1998-01-06 Eastman Kodak Company Camera that records an active image area identifier with an image
DE69632384T2 (en) * 1995-12-19 2005-05-04 Canon K.K. Apparatus and method for controlling a plurality of remote cameras
US6314140B1 (en) * 1995-12-28 2001-11-06 Lucent Technologies Inc. Dynamic video focus control
JP3996960B2 (en) * 1996-01-30 2007-10-24 キヤノン株式会社 Camera control system
JP3696999B2 (en) * 1996-01-30 2005-09-21 キヤノン株式会社 Camera control system and camera control device
JP3573309B2 (en) * 1996-02-06 2004-10-06 ソニー株式会社 Monitoring device and method
US5719622A (en) * 1996-02-23 1998-02-17 The Regents Of The University Of Michigan Visual control selection of remote mechanisms
JPH09238385A (en) * 1996-02-29 1997-09-09 Victor Co Of Japan Ltd Remote control method for house appliance
US5841992A (en) * 1996-03-25 1998-11-24 Snap-On Tools Company Network-to-serial device intelligent converter
US6112083A (en) * 1996-03-27 2000-08-29 Amsc Subsidiary Corporation Full service dispatcher for satellite trunked radio service system
US6189034B1 (en) * 1996-05-08 2001-02-13 Apple Computer, Inc. Method and apparatus for dynamic launching of a teleconferencing application upon receipt of a call
US6295549B1 (en) 1996-05-08 2001-09-25 Apple Computer, Inc. Method and apparatus for listening for incoming calls on multiple port/socket combinations
US5959667A (en) * 1996-05-09 1999-09-28 Vtel Corporation Voice activated camera preset selection system and method of operation
US5946470A (en) * 1996-05-15 1999-08-31 Intel Corporation Method and apparatus for voltage level shifting and deskewing the inputs to a high performance microprocessor
US5805812A (en) * 1996-05-15 1998-09-08 Electronic Data Systems Corporation Communication system for the remote control of equipment
JPH104531A (en) * 1996-06-14 1998-01-06 Nikon Corp Information processor
JP3839881B2 (en) * 1996-07-22 2006-11-01 キヤノン株式会社 Imaging control apparatus and control method thereof
US5956523A (en) * 1996-08-09 1999-09-21 Advantech Co., Ltd. Method and apparatus for reducing the number of RS232/RS485 transmission converters required for communicating between a PC and a plurality of instruments
US5903302A (en) * 1996-10-04 1999-05-11 Datapoint Corporation Automated video call distribution
US6476854B1 (en) * 1996-10-18 2002-11-05 Compaq Information Technologies Group, L.P. Video eavesdropping and reverse assembly to transmit video action to a remote console
KR19990076722A (en) * 1996-10-24 1999-10-15 이데이 노부유끼 Camera device
JP3943674B2 (en) * 1996-10-25 2007-07-11 キヤノン株式会社 Camera control system, camera server and control method thereof
US5874944A (en) * 1996-11-13 1999-02-23 Vlsi Technology, Inc. Variable voltage detector power-up and power-down circuit for a joystick interface
US5893062A (en) 1996-12-05 1999-04-06 Interval Research Corporation Variable rate video playback with synchronized audio
US6263507B1 (en) 1996-12-05 2001-07-17 Interval Research Corporation Browser for use in navigating a body of information, with particular application to browsing information represented by audiovisual data
US7054271B2 (en) 1996-12-06 2006-05-30 Ipco, Llc Wireless network system and method for providing same
US8982856B2 (en) 1996-12-06 2015-03-17 Ipco, Llc Systems and methods for facilitating wireless network communication, satellite-based wireless network systems, and aircraft-based wireless network systems, and related methods
US6181381B1 (en) * 1996-12-13 2001-01-30 Ncr Corporation Camera lens within pivoting hinge in portable electronic device
US6275258B1 (en) * 1996-12-17 2001-08-14 Nicholas Chim Voice responsive image tracking system
JP3869897B2 (en) * 1997-01-28 2007-01-17 キヤノン株式会社 Camera control system, video receiving apparatus, control method, and storage medium
US6356303B1 (en) * 1997-02-05 2002-03-12 Canon Kabushiki Kaisha Camera control system with zoom lens control based on transmission time
US6061055A (en) * 1997-03-21 2000-05-09 Autodesk, Inc. Method of tracking objects with an imaging device
US6275252B1 (en) * 1997-03-25 2001-08-14 Sony Corporation Method and apparatus for improving video conferencing video capture by distance adaptive filtering
US5898459A (en) * 1997-03-26 1999-04-27 Lectrolarm Custom Systems, Inc. Multi-camera programmable pan-and-tilt apparatus
EP1021917A4 (en) 1997-03-31 2002-05-15 Broadband Associates Method and system for providing a presentation on a network
US7412533B1 (en) 1997-03-31 2008-08-12 West Corporation Providing a presentation on a network having a plurality of synchronized media types
US7490169B1 (en) 1997-03-31 2009-02-10 West Corporation Providing a presentation on a network having a plurality of synchronized media types
US7143177B1 (en) 1997-03-31 2006-11-28 West Corporation Providing a presentation on a network having a plurality of synchronized media types
US6624846B1 (en) * 1997-07-18 2003-09-23 Interval Research Corporation Visual user interface for use in controlling the interaction of a device with a spatial region
JP3575957B2 (en) * 1997-07-28 2004-10-13 シャープ株式会社 Region extraction method for videophone and videophone terminal
JP3581560B2 (en) * 1997-07-29 2004-10-27 キヤノン株式会社 Camera control system, computer terminal, control method thereof, and storage medium storing program for executing the control
US7071971B2 (en) * 1997-08-25 2006-07-04 Elbex Video Ltd. Apparatus for identifying the scene location viewed via remotely operated television camera
KR100260936B1 (en) 1997-08-30 2000-07-01 윤종용 Apparatus and a method of debuging multifunctional product
EP1309194A1 (en) * 1997-09-04 2003-05-07 Discovery Communications, Inc. Apparatus for video access and control over computer network, including image correction
US7054916B2 (en) * 1997-12-08 2006-05-30 Sanyo Electric Co., Ltd. Imaging apparatus and network system using the same
JP3649883B2 (en) * 1997-12-08 2005-05-18 三洋電機株式会社 Imaging apparatus and network system
US6380968B1 (en) * 1998-01-06 2002-04-30 Intel Corporation Method and apparatus for controlling a remote video camera in a video conferencing system
JP4109739B2 (en) * 1998-01-19 2008-07-02 キヤノン株式会社 CAMERA CONTROL DEVICE, CAMERA CONTROL SYSTEM, CAMERA CONTROL DEVICE CONTROL METHOD, AND STORAGE MEDIUM
US6819345B1 (en) * 1998-02-17 2004-11-16 Microsoft Corporation Managing position and size for a desktop component
US6208378B1 (en) * 1998-02-23 2001-03-27 Netergy Networks Video arrangement with remote activation of appliances and remote playback of locally captured video data
US6346962B1 (en) 1998-02-27 2002-02-12 International Business Machines Corporation Control of video conferencing system with pointing device
US6178204B1 (en) * 1998-03-30 2001-01-23 Intel Corporation Adaptive control of video encoder's bit allocation based on user-selected region-of-interest indication feedback from video decoder
CN1178467C (en) * 1998-04-16 2004-12-01 三星电子株式会社 Method and apparatus for automatically tracing moving object
JPH11306119A (en) 1998-04-17 1999-11-05 Minolta Co Ltd Network system
US7355621B1 (en) 1998-06-10 2008-04-08 Fernandez Dennis S Digital television with subscriber conference overlay
US6339842B1 (en) 1998-06-10 2002-01-15 Dennis Sunga Fernandez Digital television with subscriber conference overlay
JP2000002913A (en) * 1998-06-15 2000-01-07 Matsushita Electric Ind Co Ltd Monitor camera rotation controller
US6218953B1 (en) 1998-10-14 2001-04-17 Statsignal Systems, Inc. System and method for monitoring the light level around an ATM
US8410931B2 (en) 1998-06-22 2013-04-02 Sipco, Llc Mobile inventory unit monitoring systems and methods
US6028522A (en) * 1998-10-14 2000-02-22 Statsignal Systems, Inc. System for monitoring the light level around an ATM
US6891838B1 (en) 1998-06-22 2005-05-10 Statsignal Ipc, Llc System and method for monitoring and controlling residential devices
US6914893B2 (en) 1998-06-22 2005-07-05 Statsignal Ipc, Llc System and method for monitoring and controlling remote devices
US6437692B1 (en) 1998-06-22 2002-08-20 Statsignal Systems, Inc. System and method for monitoring and controlling remote devices
US6847334B2 (en) 1998-06-29 2005-01-25 William Hayhurst Mobile telecommunication device for simultaneously transmitting and receiving sound and image data
JP4032404B2 (en) * 1998-07-10 2008-01-16 フジノン株式会社 Operating device
US6212632B1 (en) 1998-07-31 2001-04-03 Flashpoint Technology, Inc. Method and system for efficiently reducing the RAM footprint of software executing on an embedded computer system
US6704048B1 (en) * 1998-08-27 2004-03-09 Polycom, Inc. Adaptive electronic zoom control
EP1033879A4 (en) * 1998-09-18 2004-05-19 Mitsubishi Electric Corp Camera control system
US20020013679A1 (en) * 1998-10-14 2002-01-31 Petite Thomas D. System and method for monitoring the light level in a lighted area
US6226696B1 (en) 1998-10-29 2001-05-01 Fairchild Semiconductor Corporation Programmable circuit for sensing computer pointing devices to sink a different amount of current depending on the requirements of the device
US6201562B1 (en) 1998-10-31 2001-03-13 Kar-Wing E. Lor Internet protocol video phone adapter for high bandwidth data access
US7057636B1 (en) 1998-12-22 2006-06-06 Koninklijke Philips Electronics N.V. Conferencing system and method for the automatic determination of preset positions corresponding to participants in video-mediated communications
US7966078B2 (en) 1999-02-01 2011-06-21 Steven Hoffberg Network media appliance system and method
US7650425B2 (en) 1999-03-18 2010-01-19 Sipco, Llc System and method for controlling communication between a host computer and communication devices associated with remote devices in an automated monitoring system
US6532218B1 (en) * 1999-04-05 2003-03-11 Siemens Information & Communication Networks, Inc. System and method for multimedia collaborative conferencing
JP4250802B2 (en) * 1999-04-16 2009-04-08 フジノン株式会社 Remote head system
JP4209535B2 (en) * 1999-04-16 2009-01-14 パナソニック株式会社 Camera control device
AU5727800A (en) * 1999-06-07 2000-12-28 Iviewit Holdings, Inc. System and method for video playback over a network
US8212893B2 (en) * 1999-06-08 2012-07-03 Verisign, Inc. Digital camera device and methodology for distributed processing and wireless transmission of digital images
US6260023B1 (en) 1999-06-14 2001-07-10 Ncr Corporation Transaction processing system including a networked produce recognition system
US6888565B1 (en) * 1999-08-31 2005-05-03 Canon Kabushiki Kaisha Apparatus and method for remote-controlling image sensing apparatus in image sensing system
JP2001069496A (en) * 1999-08-31 2001-03-16 Matsushita Electric Ind Co Ltd Supervisory camera apparatus and control method for supervisory camera
US6992702B1 (en) 1999-09-07 2006-01-31 Fuji Xerox Co., Ltd System for controlling video and motion picture cameras
US7155735B1 (en) 1999-10-08 2006-12-26 Vulcan Patents Llc System and method for the broadcast dissemination of time-ordered data
US6625812B2 (en) 1999-10-22 2003-09-23 David Hardin Abrams Method and system for preserving and communicating live views of a remote physical location over a computer network
US7299405B1 (en) 2000-03-08 2007-11-20 Ricoh Company, Ltd. Method and system for information management to facilitate the exchange of ideas during a collaborative effort
US7653925B2 (en) 1999-11-17 2010-01-26 Ricoh Company, Ltd. Techniques for receiving information during multimedia presentations and communicating the information
EP1309901B1 (en) * 1999-12-02 2008-05-21 Western Digital Technologies, Inc. System for remote recording of television programs
US8793374B2 (en) * 1999-12-02 2014-07-29 Western Digital Technologies, Inc. Managed peer-to-peer applications, systems and methods for distributed data access and storage
US8688797B2 (en) * 1999-12-02 2014-04-01 Western Digital Technologies, Inc. Managed peer-to-peer applications, systems and methods for distributed data access and storage
US7120692B2 (en) 1999-12-02 2006-10-10 Senvid, Inc. Access and control system for network-enabled devices
WO2001045406A1 (en) * 1999-12-17 2001-06-21 Telum (Canada) Inc. Network-based talk show system
US7542068B2 (en) * 2000-01-13 2009-06-02 Polycom, Inc. Method and system for controlling multimedia video communication
US6757682B1 (en) 2000-01-28 2004-06-29 Interval Research Corporation Alerting users to items of current interest
US20040189804A1 (en) * 2000-02-16 2004-09-30 Borden George R. Method of selecting targets and generating feedback in object tracking systems
US6980526B2 (en) * 2000-03-24 2005-12-27 Margalla Communications, Inc. Multiple subscriber videoconferencing system
CA2405526A1 (en) * 2000-04-03 2001-10-11 Anthony V. Pugliese, Iii System and method for displaying and selling goods and services
US7382397B2 (en) * 2000-07-26 2008-06-03 Smiths Detection, Inc. Systems and methods for controlling devices over a network
US20030093430A1 (en) * 2000-07-26 2003-05-15 Mottur Peter A. Methods and systems to control access to network devices
WO2002009060A2 (en) * 2000-07-26 2002-01-31 Livewave, Inc. Methods and systems for networked camera control
US7151562B1 (en) * 2000-08-03 2006-12-19 Koninklijke Philips Electronics N.V. Method and apparatus for external calibration of a camera via a graphical user interface
US20020015003A1 (en) * 2000-08-07 2002-02-07 Masami Kato Virtual space system structured by plural user terminals and server device
JP2002112098A (en) * 2000-10-03 2002-04-12 Olympus Optical Co Ltd Electronic image pickup apparatus
US7698450B2 (en) * 2000-11-17 2010-04-13 Monroe David A Method and apparatus for distributing digitized streaming video over a network
CA2328795A1 (en) 2000-12-19 2002-06-19 Advanced Numerical Methods Ltd. Applications and performance enhancements for detail-in-context viewing technology
US20020130955A1 (en) * 2001-01-12 2002-09-19 Daniel Pelletier Method and apparatus for determining camera movement control criteria
JP2002251608A (en) * 2001-02-23 2002-09-06 Mixed Reality Systems Laboratory Inc Device and method for controlling imaging device, device and method for image processing, program code, and storage medium
US6965394B2 (en) * 2001-03-30 2005-11-15 Koninklijke Philips Electronics N.V. Remote camera control device
WO2002084590A1 (en) * 2001-04-11 2002-10-24 Applied Minds, Inc. Knowledge web
US6930702B1 (en) * 2001-04-11 2005-08-16 Applied Minds, Inc. Device for positioning and operating audio and video equipment
US8416266B2 (en) 2001-05-03 2013-04-09 Noregin Assetts N.V., L.L.C. Interacting with detail-in-context presentations
CA2345803A1 (en) 2001-05-03 2002-11-03 Idelix Software Inc. User interface elements for pliable display technology implementations
US20020167617A1 (en) * 2001-05-11 2002-11-14 Vornsand Steven J. Closed loop television control system
JP4802397B2 (en) * 2001-05-30 2011-10-26 コニカミノルタホールディングス株式会社 Image photographing system and operation device
JP2002369073A (en) * 2001-06-04 2002-12-20 Toshiba Corp Mobile wireless terminal
WO2002101534A1 (en) 2001-06-12 2002-12-19 Idelix Software Inc. Graphical user interface with zoom for detail-in-context presentations
US9760235B2 (en) 2001-06-12 2017-09-12 Callahan Cellular L.L.C. Lens-defined adjustment of displays
US7084886B2 (en) 2002-07-16 2006-08-01 Idelix Software Inc. Using detail-in-context lenses for accurate digital image cropping and measurement
GB0116877D0 (en) * 2001-07-10 2001-09-05 Hewlett Packard Co Intelligent feature selection and pan zoom control
US20030023700A1 (en) * 2001-07-27 2003-01-30 Lightsurf Technologies, Inc. System and methodology providing on-board user interface
US7082270B2 (en) * 2001-10-01 2006-07-25 Xerox Corporation Machine optimization methodology
US8489063B2 (en) 2001-10-24 2013-07-16 Sipco, Llc Systems and methods for providing emergency messages to a mobile device
US7480501B2 (en) 2001-10-24 2009-01-20 Statsignal Ipc, Llc System and method for transmitting an emergency message over an integrated wireless network
US6980485B2 (en) * 2001-10-25 2005-12-27 Polycom, Inc. Automatic camera tracking using beamforming
US7424527B2 (en) 2001-10-30 2008-09-09 Sipco, Llc System and method for transmitting pollution information over an integrated wireless network
CA2361341A1 (en) 2001-11-07 2003-05-07 Idelix Software Inc. Use of detail-in-context presentation on stereoscopically paired images
WO2003063484A1 (en) * 2002-01-16 2003-07-31 Polycom, Inc. Method and system for controlling multimedia video communication
US7724281B2 (en) * 2002-02-04 2010-05-25 Syniverse Icx Corporation Device facilitating efficient transfer of digital content from media capture device
CA2370752A1 (en) * 2002-02-05 2003-08-05 Idelix Software Inc. Fast rendering of pyramid lens distorted raster images
US20070285504A1 (en) * 2002-02-15 2007-12-13 Hesse Thomas H Systems and methods for conferencing among governed and external participants
US20050084086A1 (en) * 2002-02-15 2005-04-21 Hesse Thomas H. Systems and methods for conferencing among governed and external participants
US7046779B2 (en) * 2002-02-15 2006-05-16 Multimedia Telesys, Inc. Video conference system and methods for use at multi-station sites
US20030195834A1 (en) * 2002-04-10 2003-10-16 Hillis W. Daniel Automated online purchasing system
US7844610B2 (en) * 2003-12-12 2010-11-30 Google Inc. Delegated authority evaluation system
US8069175B2 (en) * 2002-04-10 2011-11-29 Google Inc. Delegating authority to evaluate content
CA2386560A1 (en) * 2002-05-15 2003-11-15 Idelix Software Inc. Controlling optical hardware and dynamic data viewing systems with detail-in-context viewing tools
US20040070616A1 (en) * 2002-06-02 2004-04-15 Hildebrandt Peter W. Electronic whiteboard
US6880987B2 (en) * 2002-06-21 2005-04-19 Quickset International, Inc. Pan and tilt positioning unit
JP2004032372A (en) * 2002-06-26 2004-01-29 Fuji Photo Film Co Ltd Image data processing method, portable terminal device and program
US8120624B2 (en) 2002-07-16 2012-02-21 Noregin Assets N.V. L.L.C. Detail-in-context lenses for digital image cropping, measurement and online maps
CA2393887A1 (en) 2002-07-17 2004-01-17 Idelix Software Inc. Enhancements to user interface for detail-in-context data presentation
FR2842687B1 (en) * 2002-07-19 2005-02-04 France Telecom VISIOPHONIC STATION AND METHOD FOR VISIOPHONIC RELATIONSHIP
US6925357B2 (en) * 2002-07-25 2005-08-02 Intouch Health, Inc. Medical tele-robotic system
US20040162637A1 (en) 2002-07-25 2004-08-19 Yulun Wang Medical tele-robotic system with a master remote station with an arbitrator
FR2843516B1 (en) * 2002-08-12 2004-12-24 France Telecom METHOD FOR REAL-TIME BROADCAST OF MULTIMEDIA FILES DURING A VIDEO CONFERENCE, WITHOUT COMMUNICATION BREAK, AND MAN-MACHINE INTERFACE FOR IMPLEMENTATION
AU2003268402A1 (en) * 2002-09-10 2004-04-30 Sigcom, Inc. Software architecture system for a security management system
CA2406131A1 (en) 2002-09-30 2004-03-30 Idelix Software Inc. A graphical user interface using detail-in-context folding
CA2449888A1 (en) 2003-11-17 2005-05-17 Idelix Software Inc. Navigating large images using detail-in-context fisheye rendering techniques
US20070097109A1 (en) * 2005-10-18 2007-05-03 Idelix Software Inc. Method and system for generating detail-in-context presentations in client/server systems
CA2411898A1 (en) 2002-11-15 2004-05-15 Idelix Software Inc. A method and system for controlling access to detail-in-context presentations
AU2003297193A1 (en) 2002-12-13 2004-07-09 Applied Minds, Inc. Meta-web
US8012025B2 (en) * 2002-12-13 2011-09-06 Applied Minds, Llc Video game controller hub with control input reduction and combination schemes
US6686794B1 (en) * 2002-12-19 2004-02-03 Intel Corporation Differential charge pump
US6747506B1 (en) * 2002-12-20 2004-06-08 Intel Corporation Charge pump architecture
US7446797B2 (en) * 2003-02-10 2008-11-04 Activeye, Inc. User assisted customization of automated video surveillance systems
JP4136712B2 (en) * 2003-02-25 2008-08-20 キヤノン株式会社 Imaging control device and imaging system
JP4474106B2 (en) * 2003-02-27 2010-06-02 キヤノン株式会社 Image processing apparatus, image processing method, recording medium, and program
US20040180689A1 (en) * 2003-03-14 2004-09-16 Logicacmg Wireless Networks, Inc. Systems and methods for establishing communication between a first wireless terminal and a second wireless terminal differing in respect to at least one feature
US7559026B2 (en) * 2003-06-20 2009-07-07 Apple Inc. Video conferencing system having focus control
US7397495B2 (en) * 2003-06-20 2008-07-08 Apple Inc. Video conferencing apparatus and method
KR20050000276A (en) * 2003-06-24 2005-01-03 주식회사 성진씨앤씨 Virtual joystick system for controlling the operation of a security camera and controlling method thereof
US7995090B2 (en) * 2003-07-28 2011-08-09 Fuji Xerox Co., Ltd. Video enabled tele-presence control host
US8228377B2 (en) * 2003-09-12 2012-07-24 Logitech Europe S.A. Pan and tilt camera
US7689712B2 (en) 2003-11-26 2010-03-30 Ricoh Company, Ltd. Techniques for integrating note-taking and multimedia information
US7813836B2 (en) 2003-12-09 2010-10-12 Intouch Technologies, Inc. Protocol for a remotely controlled videoconferencing robot
US20050131918A1 (en) * 2003-12-12 2005-06-16 W. Daniel Hillis Personalized profile for evaluating content
KR100617702B1 (en) * 2004-01-13 2006-08-28 삼성전자주식회사 Portable terminal capable of editing image and image edition method using that
IL159838A0 (en) 2004-01-13 2004-06-20 Yehuda Binder Information device
JP3847753B2 (en) 2004-01-30 2006-11-22 株式会社ソニー・コンピュータエンタテインメント Image processing apparatus, image processing method, recording medium, computer program, semiconductor device
US7366995B2 (en) * 2004-02-03 2008-04-29 Roland Wescott Montague Combination tool that zooms in, zooms out, pans, rotates, draws, or manipulates during a drag
CA2472871C (en) 2004-02-18 2011-10-25 Inter-Cite Video Inc. System and method for the automated, remote diagnostic of the operation of a digital video recording network
US7756086B2 (en) 2004-03-03 2010-07-13 Sipco, Llc Method for communicating in dual-modes
US8031650B2 (en) 2004-03-03 2011-10-04 Sipco, Llc System and method for monitoring remote devices with a dual-mode wireless communication protocol
US7126816B2 (en) 2004-03-12 2006-10-24 Apple Computer, Inc. Camera latch
US7486302B2 (en) 2004-04-14 2009-02-03 Noregin Assets N.V., L.L.C. Fisheye lens graphical user interfaces
JP4337614B2 (en) * 2004-04-26 2009-09-30 カシオ計算機株式会社 Electronic camera and program
US8106927B2 (en) 2004-05-28 2012-01-31 Noregin Assets N.V., L.L.C. Graphical user interfaces and occlusion prevention for fisheye lenses with line segment foci
US8644525B2 (en) * 2004-06-02 2014-02-04 Clearone Communications, Inc. Virtual microphones in electronic conferencing systems
US7864937B2 (en) * 2004-06-02 2011-01-04 Clearone Communications, Inc. Common control of an electronic multi-pod conferencing system
US7916849B2 (en) * 2004-06-02 2011-03-29 Clearone Communications, Inc. Systems and methods for managing the gating of microphones in a multi-pod conference system
US8031853B2 (en) * 2004-06-02 2011-10-04 Clearone Communications, Inc. Multi-pod conference systems
US9317945B2 (en) 2004-06-23 2016-04-19 Callahan Cellular L.L.C. Detail-in-context lenses for navigation
NO323527B1 (en) * 2004-07-01 2007-06-04 Tandberg Telecom As Monitoring and control of management systems
US8077963B2 (en) 2004-07-13 2011-12-13 Yulun Wang Mobile robot with a head-based movement mapping scheme
US7623156B2 (en) * 2004-07-16 2009-11-24 Polycom, Inc. Natural pan tilt zoom camera motion to preset camera positions
JP4716083B2 (en) * 2004-07-27 2011-07-06 ソニー株式会社 Information processing apparatus and method, recording medium, and program
JP2006041886A (en) * 2004-07-27 2006-02-09 Sony Corp Information processor and method, recording medium, and program
JP4733942B2 (en) * 2004-08-23 2011-07-27 株式会社日立国際電気 Camera system
US7714859B2 (en) 2004-09-03 2010-05-11 Shoemaker Garth B D Occlusion reduction and magnification for multidimensional data presentations
JP2006084999A (en) * 2004-09-17 2006-03-30 Fujinon Corp Af-area control system
US7995078B2 (en) 2004-09-29 2011-08-09 Noregin Assets, N.V., L.L.C. Compound lenses for multi-source data presentation
US7917935B2 (en) * 2004-10-01 2011-03-29 Logitech Europe S.A. Mechanical pan, tilt and zoom in a webcam
US7717629B2 (en) * 2004-10-15 2010-05-18 Lifesize Communications, Inc. Coordinated camera pan tilt mechanism
US8054336B2 (en) * 2004-10-15 2011-11-08 Lifesize Communications, Inc. High definition pan tilt zoom camera with embedded microphones and thin cable for data and power
US7602141B2 (en) * 2004-10-15 2009-10-13 Lifesize Communications, Inc. Battery operated speakerphone and charging stand
US7473040B2 (en) * 2004-10-15 2009-01-06 Lifesize Communications, Inc. High definition camera pan tilt mechanism
US8149739B2 (en) * 2004-10-15 2012-04-03 Lifesize Communications, Inc. Background call validation
US7545435B2 (en) * 2004-10-15 2009-06-09 Lifesize Communications, Inc. Automatic backlight compensation and exposure control
US7572073B2 (en) * 2004-10-15 2009-08-11 Lifesize Communications, Inc. Camera support mechanism
US20060106929A1 (en) * 2004-10-15 2006-05-18 Kenoyer Michael L Network conference communications
US7667728B2 (en) * 2004-10-15 2010-02-23 Lifesize Communications, Inc. Video and audio conferencing system with spatial audio
US8339447B2 (en) * 2004-10-21 2012-12-25 Truevision Systems, Inc. Stereoscopic electronic microscope workstation
JP4770178B2 (en) * 2005-01-17 2011-09-14 ソニー株式会社 Camera control apparatus, camera system, electronic conference system, and camera control method
US7599989B2 (en) * 2005-01-24 2009-10-06 Microsoft Corporation System and method for gathering and reporting screen resolutions of attendees of a collaboration session
WO2006081206A1 (en) 2005-01-25 2006-08-03 Sipco, Llc Wireless network protocol systems and methods
US8274534B2 (en) * 2005-01-31 2012-09-25 Roland Wescott Montague Methods for combination tools that zoom, pan, rotate, draw, or manipulate during a drag
JP4641424B2 (en) * 2005-02-02 2011-03-02 キヤノン株式会社 Imaging device
US7710452B1 (en) 2005-03-16 2010-05-04 Eric Lindberg Remote video monitoring of non-urban outdoor sites
TWI306213B (en) * 2005-03-24 2009-02-11 Via Tech Inc Display mode management system and method
US8457614B2 (en) * 2005-04-07 2013-06-04 Clearone Communications, Inc. Wireless multi-unit conference phone
US7580036B2 (en) * 2005-04-13 2009-08-25 Catherine Montagnese Detail-in-context terrain displacement algorithm with optimizations
US20060232550A1 (en) * 2005-04-15 2006-10-19 Buckner Nathan C Integrated mouse in remote control
US7750944B2 (en) * 2005-05-02 2010-07-06 Ge Security, Inc. Methods and apparatus for camera operation
US20060248210A1 (en) * 2005-05-02 2006-11-02 Lifesize Communications, Inc. Controlling video display mode in a video conferencing system
US7945938B2 (en) * 2005-05-11 2011-05-17 Canon Kabushiki Kaisha Network camera system and control method therefore
US7554576B2 (en) * 2005-06-20 2009-06-30 Ricoh Company, Ltd. Information capture and recording system for controlling capture devices
US8805929B2 (en) * 2005-06-20 2014-08-12 Ricoh Company, Ltd. Event-driven annotation techniques
FI20055369A0 (en) * 2005-06-30 2005-06-30 Nokia Corp Method and device for processing digital media files
US9198728B2 (en) 2005-09-30 2015-12-01 Intouch Technologies, Inc. Multi-camera mobile teleconferencing platform
US8031206B2 (en) 2005-10-12 2011-10-04 Noregin Assets N.V., L.L.C. Method and system for generating pyramid fisheye lens detail-in-context presentations
US8358330B2 (en) * 2005-10-21 2013-01-22 True Vision Systems, Inc. Stereoscopic electronic microscope workstation
US20070116458A1 (en) * 2005-11-18 2007-05-24 Mccormack Kenneth Methods and systems for operating a pan tilt zoom camera
US20070171275A1 (en) * 2006-01-24 2007-07-26 Kenoyer Michael L Three Dimensional Videoconferencing
US8125508B2 (en) 2006-01-24 2012-02-28 Lifesize Communications, Inc. Sharing participant information in a videoconference
US8120638B2 (en) * 2006-01-24 2012-02-21 Lifesize Communications, Inc. Speech to text conversion in a videoconference
US8531275B2 (en) * 2006-02-02 2013-09-10 The Directv Group, Inc. Remote control mode on-screen displays and methods for producing the same
US7704617B2 (en) * 2006-04-03 2010-04-27 Bloom Energy Corporation Hybrid reformer for fuel flexibility
US7983473B2 (en) 2006-04-11 2011-07-19 Noregin Assets, N.V., L.L.C. Transparency adjustment of a presentation
US7710450B2 (en) 2006-04-20 2010-05-04 Cisco Technology, Inc. System and method for dynamic control of image capture in a video conference system
US8952974B2 (en) * 2006-04-20 2015-02-10 Cisco Technology, Inc. Latency reduction in a display device
US7558823B2 (en) * 2006-05-31 2009-07-07 Hewlett-Packard Development Company, L.P. System and method for managing virtual collaboration systems
US20070291104A1 (en) * 2006-06-07 2007-12-20 Wavetronex, Inc. Systems and methods of capturing high-resolution images of objects
CN101087398B (en) * 2006-06-08 2010-05-12 中兴通讯股份有限公司 A method and device for realizing remote monitoring in conference TV system
US8849679B2 (en) 2006-06-15 2014-09-30 Intouch Technologies, Inc. Remote controlled robot system that provides medical images
US20070291128A1 (en) * 2006-06-15 2007-12-20 Yulun Wang Mobile teleconferencing system that projects an image provided by a mobile robot
TW200801933A (en) * 2006-06-22 2008-01-01 Univ Nat Central Positioning apparatus and method for a remote control camera combined with electronic map
US7667762B2 (en) * 2006-08-01 2010-02-23 Lifesize Communications, Inc. Dual sensor video camera
US20080063389A1 (en) * 2006-09-13 2008-03-13 General Instrument Corporation Tracking a Focus Point by a Remote Camera
GB0619850D0 (en) * 2006-10-06 2006-11-15 Vitec Group Plc The Camera control interface
AU2007221976B2 (en) * 2006-10-19 2009-12-24 Polycom, Inc. Ultrasonic camera tracking system and associated methods
KR101306706B1 (en) * 2006-11-09 2013-09-11 엘지전자 주식회사 Auto install apparatus and Method for AV Device connection with digital TV
FR2910770A1 (en) * 2006-12-22 2008-06-27 France Telecom Videoconference device for e.g. TV, has light source illuminating eyes of local user viewing screen during communication with remote user, such that image sensor captures local user's image with reflection of light source on eyes
US20080171458A1 (en) * 2007-01-17 2008-07-17 Pelco, Inc. Apparatus for facilitating video connections to surveillance devices
WO2008103418A2 (en) * 2007-02-22 2008-08-28 Roy Sandberg Method and apparatus for panning, tilting, and adjusting the height of a remotely controlled camera
TW200836563A (en) * 2007-02-27 2008-09-01 Awind Inc Pointing control system for multi-site presentation conference
US9160783B2 (en) 2007-05-09 2015-10-13 Intouch Technologies, Inc. Robot system that operates through a network firewall
US8237765B2 (en) * 2007-06-22 2012-08-07 Lifesize Communications, Inc. Video conferencing device which performs multi-way conferencing
NO327899B1 (en) 2007-07-13 2009-10-19 Tandberg Telecom As Procedure and system for automatic camera control
US8139100B2 (en) 2007-07-13 2012-03-20 Lifesize Communications, Inc. Virtual multiway scaler compensation
US9026938B2 (en) 2007-07-26 2015-05-05 Noregin Assets N.V., L.L.C. Dynamic detail-in-context user interface for application access and content access on electronic displays
CA2698281C (en) 2007-07-30 2016-08-16 Twenty20, Inc. Components of a portable digital video camera
US9661267B2 (en) * 2007-09-20 2017-05-23 Lifesize, Inc. Videoconferencing system discovery
US20090099923A1 (en) * 2007-09-24 2009-04-16 Koenig Jesse D Spacecraft advertisement systems and methods
KR101187909B1 (en) * 2007-10-04 2012-10-05 삼성테크윈 주식회사 Surveillance camera system
CN101448142A (en) * 2007-11-27 2009-06-03 鸿富锦精密工业(深圳)有限公司 Image tracking device and image tracking method thereof
US20090102919A1 (en) * 2007-12-31 2009-04-23 Zamierowski David S Audio-video system and method for telecommunications
US10875182B2 (en) 2008-03-20 2020-12-29 Teladoc Health, Inc. Remote presence system mounted to operating room hardware
CN102007516A (en) * 2008-04-14 2011-04-06 汤姆森特许公司 Technique for automatically tracking an object
US8179418B2 (en) 2008-04-14 2012-05-15 Intouch Technologies, Inc. Robotic based health care system
US8170241B2 (en) 2008-04-17 2012-05-01 Intouch Technologies, Inc. Mobile tele-presence system with a microphone system
GB0807744D0 (en) * 2008-04-29 2008-06-04 Smith Howard Camera control systems
US9193065B2 (en) 2008-07-10 2015-11-24 Intouch Technologies, Inc. Docking system for a tele-presence robot
US9842192B2 (en) 2008-07-11 2017-12-12 Intouch Technologies, Inc. Tele-presence robot system with multi-cast features
US8340819B2 (en) 2008-09-18 2012-12-25 Intouch Technologies, Inc. Mobile videoconferencing robot system with network adaptive driving
JP5410720B2 (en) 2008-09-25 2014-02-05 日立コンシューマエレクトロニクス株式会社 Digital information signal transmitting / receiving apparatus and digital information signal transmitting / receiving method
US8996165B2 (en) 2008-10-21 2015-03-31 Intouch Technologies, Inc. Telepresence robot with a camera boom
US20100110160A1 (en) * 2008-10-30 2010-05-06 Brandt Matthew K Videoconferencing Community with Live Images
US20100123785A1 (en) * 2008-11-17 2010-05-20 Apple Inc. Graphic Control for Directional Audio Input
US9138891B2 (en) 2008-11-25 2015-09-22 Intouch Technologies, Inc. Server connectivity control for tele-presence robot
US8463435B2 (en) 2008-11-25 2013-06-11 Intouch Technologies, Inc. Server connectivity control for tele-presence robot
US8488001B2 (en) * 2008-12-10 2013-07-16 Honeywell International Inc. Semi-automatic relative calibration method for master slave camera control
US8849680B2 (en) * 2009-01-29 2014-09-30 Intouch Technologies, Inc. Documentation through a remote presence robot
US8456510B2 (en) * 2009-03-04 2013-06-04 Lifesize Communications, Inc. Virtual distributed multipoint control unit
US8643695B2 (en) * 2009-03-04 2014-02-04 Lifesize Communications, Inc. Videoconferencing endpoint extension
US8380866B2 (en) * 2009-03-20 2013-02-19 Ricoh Company, Ltd. Techniques for facilitating annotations
US8897920B2 (en) 2009-04-17 2014-11-25 Intouch Technologies, Inc. Tele-presence robot system with software modularity, projector and laser pointer
JP5322287B2 (en) * 2009-05-28 2013-10-23 パナソニック株式会社 Camera device with turntable
US20100302403A1 (en) * 2009-06-02 2010-12-02 Raytheon Company Generating Images With Different Fields Of View
JP5453953B2 (en) * 2009-06-24 2014-03-26 ソニー株式会社 Movable mechanism control device, movable mechanism control method, program
US8305421B2 (en) * 2009-06-29 2012-11-06 Lifesize Communications, Inc. Automatic determination of a configuration for a conference
US8723988B2 (en) * 2009-07-17 2014-05-13 Sony Corporation Using a touch sensitive display to control magnification and capture of digital images by an electronic device
FR2948808B1 (en) * 2009-07-30 2012-08-03 Dassault Aviat DIGITAL DISPLAY DEVICE, IN PARTICULAR FOR THE PREPARATION OF A PATH
US11399153B2 (en) 2009-08-26 2022-07-26 Teladoc Health, Inc. Portable telepresence apparatus
US8384755B2 (en) 2009-08-26 2013-02-26 Intouch Technologies, Inc. Portable remote presence robot
NO332170B1 (en) 2009-10-14 2012-07-16 Cisco Systems Int Sarl Camera control device and method
US8350891B2 (en) * 2009-11-16 2013-01-08 Lifesize Communications, Inc. Determining a videoconference layout based on numbers of participants
KR20110055244A (en) * 2009-11-19 2011-05-25 삼성전자주식회사 Digital photographing apparatus, method for controlling the same, and recording medium storing program to execute the method
US8719112B2 (en) * 2009-11-24 2014-05-06 Microsoft Corporation Invocation of accessory-specific user experience
US7865629B1 (en) * 2009-11-24 2011-01-04 Microsoft Corporation Configurable connector for system-level communication
US9495697B2 (en) * 2009-12-10 2016-11-15 Ebay Inc. Systems and methods for facilitating electronic commerce over a network
US11154981B2 (en) 2010-02-04 2021-10-26 Teladoc Health, Inc. Robot user interface for telepresence robot system
US8670017B2 (en) 2010-03-04 2014-03-11 Intouch Technologies, Inc. Remote presence system including a cart that supports a robot face and an overhead camera
US9294716B2 (en) * 2010-04-30 2016-03-22 Alcatel Lucent Method and system for controlling an imaging system
US10343283B2 (en) 2010-05-24 2019-07-09 Intouch Technologies, Inc. Telepresence robot system that can be accessed by a cellular phone
US10808882B2 (en) 2010-05-26 2020-10-20 Intouch Technologies, Inc. Tele-robotic system with a robot face placed on a chair
US8963987B2 (en) 2010-05-27 2015-02-24 Microsoft Corporation Non-linguistic signal detection and feedback
US8670018B2 (en) * 2010-05-27 2014-03-11 Microsoft Corporation Detecting reactions and providing feedback to an interaction
US8331760B2 (en) 2010-06-02 2012-12-11 Microsoft Corporation Adaptive video zoom
US8848054B2 (en) * 2010-07-29 2014-09-30 Crestron Electronics Inc. Presentation capture with automatically configurable output
CN103210639B (en) 2010-09-13 2019-05-07 康道尔知识产权控股有限责任公司 It is configured to the portable digital video video camera of remote image collection control and viewing
US8174931B2 (en) 2010-10-08 2012-05-08 HJ Laboratories, LLC Apparatus and method for providing indoor location, position, or tracking of a mobile computer using building information
US9264664B2 (en) 2010-12-03 2016-02-16 Intouch Technologies, Inc. Systems and methods for dynamic bandwidth allocation
US12093036B2 (en) 2011-01-21 2024-09-17 Teladoc Health, Inc. Telerobotic system with a dual application screen presentation
EP2668008A4 (en) 2011-01-28 2018-01-24 Intouch Technologies, Inc. Interfacing with a mobile telepresence robot
US9323250B2 (en) 2011-01-28 2016-04-26 Intouch Technologies, Inc. Time-dependent navigation of telepresence robots
US8791911B2 (en) 2011-02-09 2014-07-29 Robotzone, Llc Multichannel controller
CN102143323A (en) * 2011-03-31 2011-08-03 郑明 Novel camera controller
US10769739B2 (en) 2011-04-25 2020-09-08 Intouch Technologies, Inc. Systems and methods for management of information among medical providers and facilities
US20140139616A1 (en) 2012-01-27 2014-05-22 Intouch Technologies, Inc. Enhanced Diagnostics for a Telepresence Robot
US9098611B2 (en) 2012-11-26 2015-08-04 Intouch Technologies, Inc. Enhanced video interaction for a user interface of a telepresence network
US8736661B2 (en) * 2011-06-03 2014-05-27 Adobe Systems Incorporated Device information index and retrieval service for scalable video conferencing
US9390617B2 (en) * 2011-06-10 2016-07-12 Robotzone, Llc Camera motion control system with variable autonomy
US8836751B2 (en) 2011-11-08 2014-09-16 Intouch Technologies, Inc. Tele-presence system with a user interface that displays different communication links
US8860777B2 (en) * 2011-12-22 2014-10-14 Verizon Patent And Licensing Inc. Multi-enterprise video conference service
US8902278B2 (en) 2012-04-11 2014-12-02 Intouch Technologies, Inc. Systems and methods for visualizing and managing telepresence devices in healthcare networks
US9251313B2 (en) 2012-04-11 2016-02-02 Intouch Technologies, Inc. Systems and methods for visualizing and managing telepresence devices in healthcare networks
US9361021B2 (en) 2012-05-22 2016-06-07 Irobot Corporation Graphical user interfaces including touchpad driving interfaces for telemedicine devices
WO2013176758A1 (en) 2012-05-22 2013-11-28 Intouch Technologies, Inc. Clinical workflows utilizing autonomous and semi-autonomous telemedicine devices
US9473740B2 (en) * 2012-10-24 2016-10-18 Polycom, Inc. Automatic positioning of videoconference camera to presenter at presentation device
US9232176B2 (en) * 2013-03-04 2016-01-05 Janus Technologies, Inc. Method and apparatus for securing computer video and audio subsystems
US9462225B2 (en) 2013-03-15 2016-10-04 Zeller Digital Innovations, Inc. Presentation systems and related methods
US9930293B2 (en) 2013-03-15 2018-03-27 Zeller Digital Innovations, Inc. Presentation systems and related methods
US20150022674A1 (en) * 2013-07-18 2015-01-22 Koss Corporation Wireless video camera
JP6269014B2 (en) * 2013-12-13 2018-01-31 ソニー株式会社 Focus control device and focus control method
US9726463B2 (en) 2014-07-16 2017-08-08 Robotzone, LLC Multichannel controller for target shooting range
JP6392600B2 (en) * 2014-09-12 2018-09-19 古野電気株式会社 Operating device and cursor movement control method
JP6887745B2 (en) * 2014-12-26 2021-06-16 キヤノンメディカルシステムズ株式会社 Medical image display device
US9667859B1 (en) 2015-12-28 2017-05-30 Gopro, Inc. Systems and methods for determining preferences for capture settings of an image capturing device
US9681097B1 (en) 2016-01-20 2017-06-13 Global Tel*Link Corporation Secure video visitation system
US9967457B1 (en) 2016-01-22 2018-05-08 Gopro, Inc. Systems and methods for determining preferences for capture settings of an image capturing device
US10296994B2 (en) 2016-02-11 2019-05-21 Global Tel*Link Corporation System and method for visitation management in a controlled environment
US9558523B1 (en) 2016-03-23 2017-01-31 Global Tel*Link Corp. Secure nonscheduled video visitation system
CN107547790A (en) * 2016-06-27 2018-01-05 中兴通讯股份有限公司 Image acquisition processing method, apparatus and system
US10560636B1 (en) * 2017-03-24 2020-02-11 Cisco Technology, Inc. Crop synchronized auto focus and exposure
US11862302B2 (en) 2017-04-24 2024-01-02 Teladoc Health, Inc. Automated transcription and documentation of tele-health encounters
US10483007B2 (en) 2017-07-25 2019-11-19 Intouch Technologies, Inc. Modular telehealth cart with thermal imaging and touch screen user interface
US11636944B2 (en) 2017-08-25 2023-04-25 Teladoc Health, Inc. Connectivity infrastructure for a telehealth platform
US11122240B2 (en) 2017-09-11 2021-09-14 Michael H Peters Enhanced video conference management
US10382722B1 (en) 2017-09-11 2019-08-13 Michael H. Peters Enhanced video conference management
US11785180B2 (en) 2017-09-11 2023-10-10 Reelay Meetings, Inc. Management and analysis of related concurrent communication sessions
US11290686B2 (en) 2017-09-11 2022-03-29 Michael H Peters Architecture for scalable video conference management
US10617299B2 (en) 2018-04-27 2020-04-14 Intouch Technologies, Inc. Telehealth cart that supports a removable tablet with seamless audio/video switching
GB2608016B (en) * 2018-10-03 2023-06-07 Cmr Surgical Ltd Feature identification
JP2021052325A (en) * 2019-09-25 2021-04-01 キヤノン株式会社 Image capture device, system, method for controlling image capture device, and program
US11036375B1 (en) * 2020-02-20 2021-06-15 Lenovo (Singapore) Pte. Ltd. Dynamic zoom based on media
US11651541B2 (en) * 2021-03-01 2023-05-16 Roblox Corporation Integrated input/output (I/O) for a three-dimensional (3D) environment

Family Cites Families (27)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US3984628A (en) * 1975-01-13 1976-10-05 Paul Grayson Sharp Remote camera-position control
US4425664A (en) * 1975-11-26 1984-01-10 Bell Telephone Laboratories, Incorporated Multiport programmable digital data set
NL7810655A (en) * 1977-10-26 1979-05-01 Rank Organisation Ltd DEVICE FOR GENERATING CONTROL SIGNALS.
US4589063A (en) * 1983-08-04 1986-05-13 Fortune Systems Corporation Data processing system having automatic configuration
JPS61140281A (en) * 1984-12-12 1986-06-27 Oki Electric Ind Co Ltd Rotation controlling system of industrial television camera
JPH0654953B2 (en) * 1985-04-25 1994-07-20 ソニー株式会社 Electronic device controller
JPS63284990A (en) * 1987-05-18 1988-11-22 Daiwa Seisakusho KK Tracing device for video camera
JPS6433473A (en) * 1987-07-30 1989-02-03 Hitachi Ltd Low-temperature refrigerator
JPS6490092A (en) * 1987-09-30 1989-04-05 Suido Kiko Kk Method for removing trihalomethane precursor in water
JPH027775A (en) * 1988-06-27 1990-01-11 Nec Corp Video conference camera control system
DE3823219C1 (en) * 1988-07-08 1989-05-18 Telenorma Telefonbau Und Normalzeit Gmbh, 6000 Frankfurt, De
JPH0378373A (en) * 1989-08-22 1991-04-03 Fuji Photo Optical Co Ltd Television camera operating device
JPH03109879A (en) * 1989-09-25 1991-05-09 Canon Inc Zoom controller
US5161169A (en) * 1990-05-15 1992-11-03 Codex Corporation Dcd with reprogramming instructions contained in removable cartridge
US5230059A (en) * 1990-07-16 1993-07-20 Kraft Systems, Inc. Software - configurable adaptive computer interface
CA2058704C (en) * 1991-01-04 1997-11-25 Michael John Camille Marsh Communication system with addressable functional modules
JP2930257B2 (en) * 1991-04-22 1999-08-03 株式会社東芝 Portable electronic devices
AU2010192A (en) * 1991-05-21 1992-12-30 Videotelecom Corp. A multiple medium message recording system
US5236199A (en) * 1991-06-13 1993-08-17 Thompson Jr John W Interactive media system and telecomputing method using telephone keypad signalling
US5202899A (en) * 1991-08-16 1993-04-13 Rockwell International Corporation Apparatus for providing dynamic selection of modem protocol to support multiple modem types
JPH05183577A (en) * 1991-12-26 1993-07-23 Nec Corp Packet communication system
JPH05207451A (en) * 1992-01-29 1993-08-13 Sony Corp Video conference system
JPH05207453A (en) * 1992-01-29 1993-08-13 Sony Corp Video conference system
WO1994007327A1 (en) * 1992-09-21 1994-03-31 Rolm Company Method and apparatus for on-screen camera control in video-conference equipment
US5440632A (en) * 1992-12-02 1995-08-08 Scientific-Atlanta, Inc. Reprogrammable subscriber terminal
US5689553A (en) * 1993-04-22 1997-11-18 At&T Corp. Multimedia telecommunications network and service
JPH06313951A (en) * 1993-04-28 1994-11-08 Mitsubishi Paper Mills Ltd Image formation

Patent Citations (18)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4244006A (en) * 1975-07-18 1981-01-06 Nippon Hoso Kyokai Control device for television camera
JPS6196879A (en) * 1984-10-17 1986-05-15 Canon Inc Controller of television camera and the like
JPS6390970A (en) * 1986-10-04 1988-04-21 Canon Inc Focus controller
JPS63276371A (en) * 1987-05-08 1988-11-14 Hitachi Ltd Method and apparatus for adjusting picture angle of video camera
JPH01114171A (en) * 1987-10-27 1989-05-02 Matsushita Electric Works Ltd Camera driving device
US5049988A (en) * 1988-02-12 1991-09-17 Pearpoint Limited Scanning T.V. camera
JPH01288069A (en) * 1988-05-13 1989-11-20 Maruwa Denshi Kagaku Kk Remote controller for television camera turning base
JPH02211780A (en) * 1989-02-10 1990-08-23 Sharp Corp Video telephone system
JPH04196774A (en) * 1990-11-28 1992-07-16 Hitachi Ltd Video camera device
US5218627A (en) * 1990-12-19 1993-06-08 U S West Advanced Technologies Decentralized video telecommunication system
EP0539695A2 (en) * 1991-09-05 1993-05-05 Canon Kabushiki Kaisha TV conference system and terminal equipment for use in the same
GB2252473A (en) * 1991-09-17 1992-08-05 Radamec Epo Limited Remote control system for robotic camera
JPH0575995A (en) * 1991-09-17 1993-03-26 Canon Inc Terminal device for video conference
JPH05153458A (en) * 1991-11-30 1993-06-18 Nec Corp Remote control monitor television camera system
JPH05176217A (en) * 1991-12-24 1993-07-13 Sony Corp Pan tilter for video camera
JPH05207458A (en) * 1992-01-29 1993-08-13 Sony Corp Controller
JPH05236319A (en) * 1992-02-25 1993-09-10 Fuji Photo Optical Co Ltd Telecamera action controller using panhead
JPH05268508A (en) * 1992-03-17 1993-10-15 Fuji Photo Optical Co Ltd Remote controller for television camera

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
See also references of EP0724809A4 *

Cited By (36)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP0713331A1 (en) * 1994-11-17 1996-05-22 Canon Kabushiki Kaisha Camera control device
US6452628B2 (en) 1994-11-17 2002-09-17 Canon Kabushiki Kaisha Camera control and display device using graphical user interface
US6266085B1 (en) 1994-11-17 2001-07-24 Canon Kabushiki Kaisha Camera imaging and magnification device
US5751296A (en) * 1995-10-19 1998-05-12 Fujitsu Limited Video presentation system
DE19624151C2 (en) * 1995-10-19 1999-04-01 Fujitsu Ltd Video presentation system
DE19624151A1 (en) * 1995-10-19 1997-04-24 Fujitsu Ltd Video presentation system
US7219365B2 (en) 1996-07-23 2007-05-15 Canon Kabushiki Kaisha Apparatus and method for controlling a camera connected to a network
US7298399B2 (en) 1996-07-23 2007-11-20 Canon Kabushiki Kaisha Apparatus and method for controlling a camera connected to a network
US6484195B1 (en) 1996-07-23 2002-11-19 Canon Kabushiki Kaisha Server, terminal and control methods for transmitting real-time images over the internet
US6525761B2 (en) 1996-07-23 2003-02-25 Canon Kabushiki Kaisha Apparatus and method for controlling a camera connected to a network
EP0821522A2 (en) * 1996-07-23 1998-01-28 Canon Kabushiki Kaisha Camera control apparatus and method
EP0821522A3 (en) * 1996-07-23 2001-09-12 Canon Kabushiki Kaisha Camera control apparatus and method
US6469737B1 (en) 1996-07-23 2002-10-22 Canon Kabushiki Kaisha Image-sensing server and control method and storage medium therefor
EP0944259A2 (en) * 1998-03-16 1999-09-22 plettac AG Control of camera movement
EP0944259A3 (en) * 1998-03-16 2000-08-02 plettac AG Control of camera movement
EP2148496A1 (en) * 1998-05-01 2010-01-27 Canon Kabushiki Kaisha Image communication apparatus and its control method
WO2000013417A1 (en) * 1998-08-31 2000-03-09 France Telecom Automatic system for sound and image recording
FR2782877A1 (en) * 1998-08-31 2000-03-03 France Telecom AUTOMATIC SOUND AND IMAGE SYSTEM
WO2003013140A1 (en) * 2001-07-25 2003-02-13 Stevenson Neil J A camera control apparatus and method
GB2393350A (en) * 2001-07-25 2004-03-24 Neil J Stevenson A camera control apparatus and method
GB2393350B (en) * 2001-07-25 2006-03-08 Neil J Stevenson A camera control apparatus and method
EP1428377A1 (en) * 2001-07-27 2004-06-16 Honeywell Limited A control system for allowing an operator to proportionally control a work piece
WO2003013128A1 (en) 2001-07-27 2003-02-13 Honeywell Limited A control system for allowing an operator to proportionally control a work piece
US7868919B2 (en) 2001-07-27 2011-01-11 Honeywell Limited Control system for allowing an operator to proportionally control a work piece
EP1428377A4 (en) * 2001-07-27 2006-11-29 Honeywell Ltd A control system for allowing an operator to proportionally control a work piece
KR100989660B1 (en) 2003-03-13 2010-10-26 프랑스 텔레콤 Control method and system for a remote video chain
US7557840B2 (en) 2003-03-13 2009-07-07 France Telecom Control method and system for a remote video chain
FR2852473A1 (en) * 2003-03-13 2004-09-17 France Telecom Remote video processing network control process for use in videophonic telecommunication, involves execution of modification command on video flow by video processing network before transmitting it to terminal e.g. server
EP1465425A1 (en) * 2003-03-13 2004-10-06 France Telecom Process and system for controlling a remote video channel
EP1830567A4 (en) * 2004-12-21 2011-05-18 Zte Corp A method for upgrading software in the teleconference video terminal
US8879721B2 (en) 2006-03-01 2014-11-04 Sony Corporation Audio communication system
US9774788B2 (en) 2007-02-16 2017-09-26 Axis Ab Providing area zoom functionality for a camera
EP2066116A3 (en) * 2007-11-28 2010-03-24 Sony Corporation Imaging apparatus and method, information processing apparatus and method, and recording medium storing a program therefor
US8284268B2 (en) 2007-11-28 2012-10-09 Sony Corporation Imaging apparatus and method, information processing apparatus and method, and recording medium storing a program therefor
CN101448098B (en) * 2007-11-28 2014-05-28 索尼株式会社 Imaging apparatus and method, and information processing apparatus and method
WO2011030097A1 (en) * 2009-09-11 2011-03-17 The Vitec Group Plc Camera system control and interface

Also Published As

Publication number Publication date
AU7921094A (en) 1995-05-08
CA2174336A1 (en) 1995-04-27
EP0724809A1 (en) 1996-08-07
US5526037A (en) 1996-06-11
SG67927A1 (en) 1999-10-19
US5589878A (en) 1996-12-31
US5598209A (en) 1997-01-28
US5583565A (en) 1996-12-10
EP0724809A4 (en) 1996-08-28
US5528289A (en) 1996-06-18
US5515099A (en) 1996-05-07
JPH09506217A (en) 1997-06-17
US5568183A (en) 1996-10-22
CN1135823A (en) 1996-11-13

Similar Documents

Publication Publication Date Title
US5568183A (en) Network videoconferencing system
US10462350B2 (en) Camera control apparatus and camera control method
EP0884909B1 (en) Camera control system
US6002995A (en) Apparatus and method for displaying control information of cameras connected to a network
US6580451B2 (en) Communication apparatus, image processing apparatus, communication method, and image processing method
US6646677B2 (en) Image sensing control method and apparatus, image transmission control method, apparatus, and system, and storage means storing program that implements the method
JP3581560B2 (en) Camera control system, computer terminal, control method thereof, and storage medium storing program for executing the control
KR100392727B1 (en) A computer-based remote surveillance CCTV system, a computer video matrix switcher and a control program adapted to the CCTV system
EP0715453A2 (en) Camera controller
US20100045817A1 (en) Camera control apparatus and camera control method
JPH07135594A (en) Image pickup controller
EP0893919B1 (en) Camera control system
JP2003125365A (en) Controlling device, program, and recording medium
JP2011205573A (en) Control device, camera system, and program
US20090175501A1 (en) Imaging control apparatus and imaging control method
EP0776130A2 (en) Camera control system with variable frame rate
JP3604766B2 (en) Camera control device and camera control method
US20100118165A1 (en) Video signal processing apparatus
KR100815234B1 (en) GUI apparatus and method for camera control device
JP2001268556A (en) Remote monitoring system and control terminal equipment
JPH09266548A (en) System, device, and method for camera control, and memory
JPH0530507A (en) Video conference system
JP2638098B2 (en) Head system
JPH09181952A (en) Camera control system and image input device
JPS60150389A (en) Remote supervisory system

Legal Events

Date Code Title Description
WWE Wipo information: entry into national phase
Ref document number: 94194258.9
Country of ref document: CN

AK Designated states
Kind code of ref document: A1
Designated state(s): AM AT AU BB BG BR BY CA CH CN CZ DE DK EE ES FI GB GE HU JP KE KG KP KR KZ LK LR LT LU LV MD MG MN MW NL NO NZ PL PT RO RU SD SE SI SK TJ TT UA UZ VN

AL Designated countries for regional patents
Kind code of ref document: A1
Designated state(s): KE MW SD SZ AT BE CH DE DK ES FR GB GR IE IT LU MC NL PT SE BF BJ CF CG CI CM GA GN ML MR NE SN TD TG

DFPE Request for preliminary examination filed prior to expiration of 19th month from priority date (PCT application filed before 20040101)

121 Ep: the EPO has been informed by WIPO that EP was designated in this application

WWE Wipo information: entry into national phase
Ref document number: 2174336
Country of ref document: CA

WWE Wipo information: entry into national phase
Ref document number: 1994929914
Country of ref document: EP

WWP Wipo information: published in national office
Ref document number: 1994929914
Country of ref document: EP

REG Reference to national code
Ref country code: DE
Ref legal event code: 8642

WWW Wipo information: withdrawn in national office
Ref document number: 1994929914
Country of ref document: EP