WO2011030097A1 - Interface and control of a camera system - Google Patents

Interface and control of a camera system

Info

Publication number
WO2011030097A1
Authority
WO
WIPO (PCT)
Prior art keywords
camera
camera system
spatial
interface
controllable
Prior art date
Application number
PCT/GB2010/001697
Other languages
English (en)
Inventor
Richard Arthur Lindsay
Philip Christopher Dalgoutte
Original Assignee
The Vitec Group Plc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by The Vitec Group Plc filed Critical The Vitec Group Plc
Publication of WO2011030097A1

Links

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60Control of cameras or camera modules
    • H04N23/66Remote control of cameras or camera parts, e.g. by remote control devices
    • H04N23/661Transmitting camera control signals through networks, e.g. control via the Internet

Definitions

  • the present invention relates to a method of controlling a camera system, a method for monitoring spatial information for a camera system, and a method for interfacing a camera system with a camera system controller and camera system interface. It further relates to a system for controlling the capture of camera images using a camera system and a system for monitoring parameter information for a camera system.
  • a device interface for communicating a position or movement of an associated electrical device that forms part of a camera system, and an integrated device interface, are also considered.
  • a typical studio of this type comprises one or more camera systems, which are each made up of multiple electrical devices, such as a pedestal, pan and tilt head and a camera. Each device may be capable of movement across a limited range within a number of degrees of freedom. The position or movement of each device is controlled by providing electrical signals. Alternatively or additionally, a device may have other controllable
  • parameters such as the focus or zoom of the camera.
  • Changing the parameters for any device may affect the camera image obtained.
  • in robotic camera studios, other control options are used.
  • Each device is typically controlled using a specific, respective protocol.
  • Control commands are issued from a central controller.
  • the controller may be designed to control a single camera system or a whole studio.
  • the system comprises: server interface 10; server 20; screen 22; keyboard 24; mouse 26; input interface 28; video switcher 30; Ethernet hub 40; camera system interfaces 50; and camera systems 60.
  • Each camera system 60 comprises: pedestal 62; pan and tilt head 65; and lens controller 68, each of which have multiple components.
  • Commands for controlling each camera system 60 are generated at the server 20 based on user input. These commands include instructions for multiple components of multiple parts of a camera system 60, for example generating separate instructions for controlling each of the different axes of movement for the pan and tilt head 65.
  • the commands are communicated through server interface 10 and Ethernet hub 40 to a camera system interface 50.
  • the camera system interface 50 then forwards these commands to the respective device in the camera system, each component of the
  • One camera system interface 50 also sends commands to the components of two different camera systems.
  • the present invention provides a method of controlling a camera system, monitoring a camera system or both.
  • a method of controlling a camera system for capturing camera images is provided, the camera system comprising a plurality of controllable devices, each of which has a respective device control protocol.
  • the method comprises: determining, at a camera system controller, an instruction to be sent to control the spatial configuration of one or more of the plurality of controllable devices; communicating a network signal from the camera system controller across a shared communication channel to a plurality of device interfaces using a common communication protocol that is independent from the device control protocol of each of the plurality of controllable devices, the network signal identifying a plurality of spatial parameters in combination based on the determined instruction, each spatial parameter being representative of a spatial position or movement within a respective degree of freedom; identifying, at a first device interface from the plurality of device interfaces, that the received network signal relates to a controllable device associated with the first device interface, based upon the received network signal; and generating a control signal at the first device interface, based upon the received instruction, so as to control the spatial configuration of the associated controllable device for capture of camera images by the camera system.
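The claimed control flow — one common-protocol signal broadcast on a shared channel, with each device interface deciding for itself whether the signal concerns its associated device — can be sketched roughly as follows. All class and field names here are illustrative assumptions, not taken from the publication:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class SpatialCommand:
    """Common-protocol network signal: a device identifier plus six
    spatial parameters, one per degree of freedom (names assumed)."""
    device_id: str
    x: float      # Cartesian position demands
    y: float
    z: float
    pitch: float  # rotational demands
    roll: float
    yaw: float

class DeviceInterface:
    """Listens on the shared channel; acts only on its own commands."""

    def __init__(self, device_id):
        self.device_id = device_id
        self.last_control_signal = None

    def on_network_signal(self, cmd: SpatialCommand) -> bool:
        # Identify step: is this signal for our associated device?
        if cmd.device_id != self.device_id:
            return False
        # Generate a device-specific control signal (here just a tuple).
        self.last_control_signal = (cmd.x, cmd.y, cmd.z,
                                    cmd.pitch, cmd.roll, cmd.yaw)
        return True

# The camera system controller broadcasts one signal to all interfaces;
# only the targeted interface generates a control signal.
interfaces = [DeviceInterface("pedestal"), DeviceInterface("pan_tilt_head")]
cmd = SpatialCommand("pedestal", 1.0, 0.0, 0.5, 0.0, 0.0, 90.0)
handled = [i.on_network_signal(cmd) for i in interfaces]
```

Because the signal format is independent of any device-specific protocol, adding a device means adding only a listener, which matches the scalability argument made above.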
  • control of the devices to yield a defined camera trajectory can be determined and effected for a specific combination and configuration of devices.
  • the camera system provides a communications and processing architecture having the flexibility and scalability to control camera trajectory in mountings comprising variable numbers, types and
  • a simple command can be used to control any device within a camera system.
  • This provides a control architecture that is readily scalable, because adding an additional device to the system requires only an additional device interface in terms of hardware. Thus, scalability and flexibility are improved.
  • where the device interface is integrated with the controllable device, the device can be added to the camera system even more easily.
  • the plurality of spatial parameters may identify a vector or spatial matrix representation.
  • the network signal comprises six spatial parameters. Then, each spatial parameter may identify a spatial position or movement within a respective degree of freedom.
  • the camera system includes a camera and the method further comprises: capturing the camera images using the camera; and controlling the configuration of the associated controllable device using the generated control signal, so as to cause a change in the captured camera images.
  • the method further comprises: acquiring a topology for the plurality of controllable devices at the camera system controller.
  • the camera system controller may receive information about each of the controllable devices and their physical relationship with respect to one another. This information may be programmed or it may be received electrically or electronically from one or more controllable devices. In such embodiments, the step of determining an instruction to control the controllable devices may be based on the acquired topology for the plurality of controllable devices.
  • the instructions sent to each device are made with an awareness of the relative spatial constraints on each device.
  • the network signal further comprises a device identifier that identifies one of the plurality of controllable devices. Then, the step of identifying is based on the device identifier in the received network signal.
  • the method further comprises:
  • each device may determine its own
  • the shared communication channel may be a packet-based routed network (such as Ethernet).
  • an Internet Protocol-based network may be used.
  • the control signal is preferably a first control signal. Then, the method further comprises generating a second control signal at the first device interface, based upon the received instruction, so as to control the
  • the device interface is able to control separately
  • At least one of the plurality of controllable devices comprises a camera.
  • the camera is not controlled by the camera system controller.
  • the common control protocol may comprise transmission of a camera parameter, the camera parameter being representative of a configuration for the camera.
  • the camera parameter is representative of a zoom configuration of the camera.
  • the camera parameter is representative of a focus configuration of the camera.
  • the method further comprises transmitting configuration information from the first device interface to the camera system controller across the shared communication channel, the configuration information
  • each device can inform the camera system controller about its relevant constraints, in
  • At least one controllable device from the plurality of controllable devices may be one of: a pan and tilt head; a camera pedestal; or a crane.
  • the present invention resides in a method of controlling a camera studio comprising a first camera system for capturing camera images and a second camera system for capturing camera images.
  • the method comprises: controlling the first camera system according to the method as previously described using a first camera system controller; and controlling the second camera system according to the method as previously described using a second camera system controller.
  • the shared communication channel for the first camera system and the shared communication channel for the second camera system are the same.
  • the first camera system controller and the second camera system controller are the same.
  • the method further comprises communicating an instruction for control of the first camera system from a studio system controller to the first camera system
  • the present invention provides a method for monitoring spatial information for a camera system, the camera system comprising a plurality of
  • the method comprises: detecting spatial information, comprising a position or movement, using a first transducer of a first electrical device from the plurality of electrical devices; communicating the detected spatial information from the first transducer to a device interface associated with the first electrical device using the respective device
  • the network signal identifying the plurality of spatial parameters; and determining spatial information at the camera system interface based upon the received plurality of spatial parameters and a configuration for the plurality of electrical devices.
  • control of the camera system can be made more scalable and flexible by using a common
  • the monitoring or feedback functions within a camera system, which direct information in the reverse direction to the control functions, can also be made more scalable and flexible.
  • the device interface is thereby able to combine parameters received from multiple transducers (such as sensors) into common parameter data for the device.
  • the camera system interface is able to combine parameter data from individual devices into parameter information for the camera system.
  • the detected parameter comprises a position or movement.
  • the network signal identifies a position or movement.
  • the network signal identifies six spatial parameters, each spatial parameter being
  • the position or movement relates to the camera system.
  • the position or movement may relate to an object upon which a camera of the camera system is focussed.
  • the network signal further comprises a device identifier that identifies the first electrical device.
  • the shared communication channel may be a packet-based routed network (such as Ethernet).
  • an Internet Protocol-based network may be used.
  • the first transducer detects a first parameter and the method further comprises: detecting a second parameter using a second transducer of the first electrical device; and communicating the detected second parameter from the second transducer to the device interface using the respective device communication protocol.
  • the step of generating the parameter data comprises combining the first detected parameter and the second detected parameter. This combination is particularly advantageous when the detected parameter comprises a position or movement.
  • this may allow the device interface to combine data from different axes in a vector formulation.
  • An electrical device may have one transducer or more than one transducer.
  • a first transducer may detect a first parameter and a second transducer may detect a second parameter.
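The combining step described above — a device interface merging readings from several per-axis transducers into common parameter data — might look like this minimal sketch (axis names and vector ordering are assumptions for illustration):

```python
def combine_transducers(readings):
    """Merge per-axis transducer readings into one six degree of
    freedom vector (x, y, z, pitch, roll, yaw).

    `readings` maps an axis name to the value detected by that axis's
    transducer; axes with no transducer default to 0.0.
    """
    axes = ("x", "y", "z", "pitch", "roll", "yaw")
    return tuple(readings.get(axis, 0.0) for axis in axes)

# e.g. a pan-and-tilt head with two transducers (pan -> yaw, tilt -> pitch):
vector = combine_transducers({"yaw": 15.0, "pitch": -3.5})
```

Packing every device's data into the same fixed vector layout is what lets the camera system interface treat heterogeneous devices uniformly.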
  • At least one of the plurality of electrical devices comprises a camera.
  • the method then further comprises: communicating camera
  • the network signal identifying a camera parameter representative of the camera information.
  • the camera parameter is representative of a zoom configuration of the camera.
  • the camera parameter is representative of a focus configuration of the camera.
  • the present invention may reside in a method for monitoring parameter information for a first camera system and a second camera system.
  • the method comprises: monitoring parameter information for the first camera system according to the further aspect of the present invention; and monitoring parameter information for the second camera system according to the further aspect of the present invention.
  • the method further comprises communicating a position or
  • a method for interfacing a camera system with a camera system controller and camera system interface comprises: the method for monitoring parameter information for the camera system according to the further aspect of the present invention; and the method for
  • the method further comprises: receiving a command at the camera system controller indicating a trajectory for the camera of the camera system.
  • the step of determining an instruction to control the configuration of one or more of the plurality of controllable devices is based on the determined position or movement and on the received command.
  • a system for controlling the capture of camera images using a camera system comprising: a camera system, comprising a plurality of controllable devices each of which has a configuration which can be controlled using a
  • each device interface being associated with a controllable device; and a camera system controller, arranged to determine an instruction to be sent to control the configuration of one or more of the plurality of
  • controllable devices and to communicate a network signal across a shared communication channel to the plurality of device interfaces using a common communication protocol that is independent from the device control protocol of each of the plurality of controllable devices, the network signal comprising the instruction for controlling the configuration of the plurality of controllable devices.
  • Each of the plurality of device interfaces is configured to identify that the received network signal relates to a controllable device associated with the respective device interface, based upon the received network signal, and to generate a control signal, based upon the received instruction, so as to control the configuration of the associated controllable device for capture of camera images by the camera system.
  • the plurality of controllable devices includes a camera arranged to capture camera images.
  • Another different aspect of the present invention provides a system for monitoring parameter information for a camera system, comprising: a camera system interface; a plurality of device interfaces, each which is arranged to communicate a network signal across a shared communication channel to the camera system interface using a common communication protocol; and a camera system, comprising a plurality of electrical devices each of which has a
  • the common communication protocol is independent from the device communication protocol of each of the plurality of electrical devices, the network signal
  • the camera system interface is further arranged to determine parameter information based upon the received detected parameter and a configuration for the plurality of electrical devices.
  • the detected parameter is optionally a position or movement for the camera system.
  • a device interface for controlling an associated controllable device that forms part of a camera system
  • the device interface comprising: a receiver, arranged to receive a network signal from a shared communication channel, the network signal comprising a spatial identifier representative of a plurality of spatial parameters, each spatial parameter identifying a spatial position or movement within a degree of freedom
  • a detector arranged to identify that the received network signal relates to the associated controllable device
  • an output arranged to generate a control signal for controlling the associated controllable device on the basis of the received spatial identifier.
  • a further alternative aspect of the present invention may reside in a device interface for communicating a
  • an input arranged to receive an indicator signal, the indicator signal being indicative of the
  • a processor arranged to determine a spatial identifier on the basis of the indicator signal, the spatial identifier being representative of a plurality of spatial parameters, each spatial parameter identifying a spatial position or movement within a degree of freedom; and a transmitter, arranged to transmit a network signal comprising the spatial identifier over a shared communication channel.
  • An integrated device interface may be provided, comprising: the device interface according to the
  • a method of communicating between a camera system for capturing camera images and a camera system interface, the camera system comprising a plurality of electrical devices each of which has a respective device communication protocol.
  • the method comprises: communicating a network signal between the camera system controller and a device interface across a shared communication channel, using a common communication protocol that is independent from the device communication protocol of each of the plurality of electrical devices; and identifying an
  • Figure 1 shows an existing arrangement for controlling a plurality of camera systems
  • Figure 2 shows an abstraction of an interface
  • Figure 3 illustrates an exemplary architecture for controlling a robotic camera system based on Figure 2;
  • Figure 4 depicts an exemplary architecture for
  • Figure 5 shows exemplary hardware for the camera system interface of Figures 2 to 4;
  • Figure 6 illustrates exemplary hardware for one of the device interfaces shown in Figures 2 to 4;
  • Figure 7 depicts an abstraction of an interface
  • Figure 8 shows a schematic of a simple arrangement for controlling a studio with two camera systems based on the architecture of Figures 2 to 7;
  • Figure 9 illustrates a schematic of a virtual reality studio based on the architecture of Figures 2 to 7;
  • Figure 10 shows a schematic of a more complex
  • the system comprises: a camera system interface 110; a system frame of reference 120; a camera trajectory 130; a standard device protocol 140; a first minimum autonomous device 141; a second minimum autonomous device 142; a third minimum autonomous device 143; camera 195; and a camera system reference point 198.
  • the term minimum autonomous device is used to describe any device, comprising one or more electronic components, that is controlled or interfaced with, typically by means of a device-specific electronic protocol. Examples of such devices include: a pedestal; a pan-and-tilt head; or a camera. Each of these devices may include multiple parts capable of individual interfacing or control, but they are essentially controlled together by means of specific protocols.
  • the components may be individually controlled.
  • the first minimum autonomous device 141 comprises a device interface 151, a device frame of reference 161 and a device trajectory 171.
  • the first minimum autonomous device 141 has four components. Each component has a joint frame of reference and a joint movement: the first component has joint frame of reference 181a and joint movement 191a; the second component has joint frame of reference 181b and joint movement 191b; the third component has joint frame of reference 181c and joint movement 191c; and the fourth component has joint frame of reference 181d and joint movement 191d.
  • the second minimum autonomous device 142 has a device interface 152, a device frame of reference 162 and a device trajectory 172.
  • the second minimum autonomous device 142 has two components.
  • the first component has joint frame of reference 182a and joint movement 192a
  • the second component has joint frame of reference 182b and joint movement 192b.
  • the third minimum autonomous device has device
  • This third minimum autonomous device 143 has only one component, having joint frame of reference 183 and joint movement 193.
  • although the terms device interface and camera system interface are used herein, when these interfaces are acting to control an associated device, they may equivalently be referred to as a device controller or camera system controller.
  • Each camera system comprises a single camera and its associated mounting and accessories.
  • the camera system interface 110 has recorded therein the device topological configuration and translates standard format camera trajectory commands into standard format device control commands. Moreover, it combines device spatial data into a camera trajectory.
  • camera system interface 110 When camera system interface 110 receives instructions to adjust the configuration of the camera system, it uses system frame of reference 120 and camera trajectory 130 to determine an instruction to be sent to control the
  • the camera system interface 110 then sends a network signal across a shared communication channel using standard device protocol 140 to each of the device
  • This standard device protocol 140 is a common communication protocol that is independent from the device-specific protocol used in each of the minimum autonomous devices.
  • the network signal includes the instruction for controlling the configuration of at least one of the minimum autonomous devices.
  • the camera system reference point 198 defines the basis (alternatively, datum) upon which the configuration of the camera system is made.
  • the instruction may be intended for one minimum
  • Device interface 151 identifies that the received network signal relates to minimum autonomous device 141. Using device frame of reference 161 and device
  • the components are controlled individually by the device interface 151, which analyses the received
  • the device interface 151 translates standard format (device independent) spatial control commands into local axis drive commands, and combinations of individual axis movements into device spatial moves.
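A rough illustration of this translation from device-independent spatial commands to local axis drive commands, under assumed axis names and a direct one-to-one mapping (a real device interface would apply device-specific scaling and limits):

```python
def to_axis_drives(command, local_axes):
    """Keep only the demand components this device can actuate.

    command    : dict mapping a degree-of-freedom name to a demanded value
    local_axes : axes the device supports, e.g. ("pan", "tilt")
    """
    # Assumed mapping from device-independent degrees of freedom to
    # local drive axes; purely illustrative.
    dof_to_axis = {"yaw": "pan", "pitch": "tilt", "z": "elevation"}
    drives = {}
    for dof, value in command.items():
        axis = dof_to_axis.get(dof)
        if axis in local_axes:
            drives[axis] = value
    return drives

# A pan-and-tilt head ignores the translational part of the command:
drives = to_axis_drives({"x": 2.0, "yaw": 30.0, "pitch": 5.0}, ("pan", "tilt"))
```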
  • This hierarchical framework and its communication protocols also support the communication of ancillary data in standard formats between levels.
  • Referring to Figure 3, there is illustrated an exemplary architecture for controlling a robotic camera system, based on the architecture shown in Figure 2. Where the same features as shown in Figure 2 are illustrated, identical reference numerals have been used.
  • Camera system interface 110 receives a synchronisation or Genlock signal 111, and also has an interface to a control desk 200 via Ethernet link 112.
  • device interface 151 is configured to control a pedestal
  • device interface 152 is configured to control a pan and tilt head, mounted on the pedestal
  • device interface 153 is
  • the camera system interface 110 receives
  • the camera system interface 110 uses inverse position and velocity kinematics to break down the vector received from control desk 200 into individual instruction vectors for each minimum autonomous device.
  • the standard device protocol 140 comprises a series of standard fields used for controlling these devices.
  • the standard device protocol defines transmission of a set of vector coordinates, comprising a three-dimensional Cartesian coordinate position (xp, yp, zp) and three rotational angles defining pitch, roll and yaw (αp, βp, γp).
  • the instruction indicates the movement of the pedestal within six degrees of freedom.
  • the camera system interface 110 provides a vector representation of the required movement instructions to the pan and tilt head via its device interface 152 using coordinates and rotational figures in six degrees of freedom.
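The decomposition performed by the camera system interface could be sketched as below, under a deliberately crude assumption (the pedestal handles translation, the pan and tilt head handles rotation); a real implementation would apply inverse position and velocity kinematics over the acquired device topology:

```python
def split_demand(xp, yp, zp, ap, bp, gp):
    """Break one global six degree of freedom demand into per-device
    instruction vectors in the standard (x, y, z, pitch, roll, yaw)
    layout. Device names and the translation/rotation split are
    illustrative assumptions, not the patent's kinematics."""
    return {
        "pedestal":      (xp, yp, zp, 0.0, 0.0, 0.0),
        "pan_tilt_head": (0.0, 0.0, 0.0, ap, bp, gp),
    }

# Move the camera 1 m in x, 2 m in y, and yaw it by 45 degrees:
instructions = split_demand(1.0, 2.0, 0.0, 0.0, 0.0, 45.0)
```

Each per-device vector would then be sent over the shared channel in the standard protocol fields described above.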
  • the camera system interface 110 also provides
  • This instruction can comprise a zoom variable, and alternatively or additionally a focus variable.
  • the device interface 152 generates separate control signals for the components of the pan and tilt head, to control each axis of movement of the pan and tilt head individually. For example, a first control signal is generated based on xp, a second control signal is generated based on yp, and so on for each component.
  • Referring to Figure 4, there is shown an exemplary architecture for interfacing a camera system with a virtual reality or camera tracking system, based on that shown in Figure 2.
  • device interface 151 is
  • device interface 153 is configured to interface a camera lens.
  • Device interface 151 receives a position indicator from the pedestal and translates this into a message for
  • the position indicator from the pedestal may comprise multiple sensor outputs, which are individually received at the device interface 151. Each sensor output can be considered an individual parameter.
  • the device interface 151 then combines these individual parameters into a single message.
  • the message then comprises a vector representation of the position (or movement) in six degrees of freedom, comprising three Cartesian positional coordinates (xp, yp, zp) and three rotational angles (αp, βp, γp).
  • the pan and tilt head communicates positional information to its respective device interface 152.
  • Device interface 152 then
  • the camera lens reports zoom angle and focus parameters to its respective device interface 153.
  • the camera system interface uses forward velocity and position kinematics to sum the received vectors from each device interface.
  • Referring to Figure 5, there is shown exemplary hardware for the camera system interface of Figures 2 to 4. This comprises: a host 900; a USB interface 901; a CAN interface 902; a first serial interface 903; a second serial interface 904; a first Ethernet interface 905; a second Ethernet interface 906; a bus 910; and a processor 920.
  • the host 900 communicates with the device interfaces 151, 152 and 153 and the other parts of the system using the USB interface 901, CAN interface 902, serial interfaces 903 and 904, and Ethernet interfaces 905 and 906. The host 900 also communicates with the processor 920 via bus 910.
  • the configuration of the camera system is stored on the camera system interface 110, for example information that the pan and tilt head is mounted on a pedestal.
  • the camera system interface 110 receives an instruction to adjust the camera image by a specific vector, it can perform the correct inverse kinematic functions to determine individual node vector demands from this global vector.
  • the camera system interface 110 can perform the correct forward kinematic summations of the vectors received from each device interface to produce a global transformation vector that describes the camera position and orientation with respect to a global reference frame.
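The forward kinematic summation can be illustrated by composing per-device homogeneous transforms from the mount base up to the camera. This sketch restricts each device to translation plus yaw for brevity; the function names are assumptions:

```python
import math

def transform(x=0.0, y=0.0, z=0.0, yaw_deg=0.0):
    """4x4 homogeneous transform: yaw rotation about z, then translation."""
    c = math.cos(math.radians(yaw_deg))
    s = math.sin(math.radians(yaw_deg))
    return [[c,  -s,  0.0, x],
            [s,   c,  0.0, y],
            [0.0, 0.0, 1.0, z],
            [0.0, 0.0, 0.0, 1.0]]

def matmul(a, b):
    # Plain 4x4 matrix product, kept dependency-free for the sketch.
    return [[sum(a[i][k] * b[k][j] for k in range(4)) for j in range(4)]
            for i in range(4)]

def camera_pose(device_transforms):
    """Compose per-device transforms from the mount base to the camera,
    yielding the global transformation described above."""
    pose = transform()  # identity
    for t in device_transforms:
        pose = matmul(pose, t)
    return pose

# Pedestal translates 2 m in x; head yaws 90 degrees; lens sits 0.1 m
# forward of the head. Global camera position ends up near (2.0, 0.1, 0).
pose = camera_pose([transform(x=2.0), transform(yaw_deg=90.0),
                    transform(x=0.1)])
```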
  • Configuration of individual minimum autonomous devices can also be performed via the camera system
  • the device interface 151 comprises: host 1000, USB interface 1001; CAN interface 1002; first serial interface 1003; second serial interface 1004; first Ethernet interface 1005; second Ethernet interface 1006; bus 1010; Field-Programmable Gate Array (FPGA) 1020; digital
  • Host 1000 communicates with a specific device and with the camera system interface 110 using USB interface 1001, CAN interface, first and second serial interfaces 1003 and 1004, and first and second Ethernet interfaces 1005 and 1006.
  • the first Ethernet channel 1005 is used for all communications with the camera system interface 110.
  • the second Ethernet channel 1006, together with the serial channels (RS 232/422), CAN and USB interfaces, is used for communication with devices connected to the device interface 151, for example a motion controller or a legacy device, enabling it to function on the architecture of the present invention.
  • the device interface 151 can accept many different transducer or communications inputs. Signal conditioning is performed on these inputs using block 1040 to convert them to 3.3V logic before they are connected to the input/output (I/O) lines of the FPGA.
  • the FPGA contains specific programming that enables it to convert the transducer inputs into a six degree of freedom transformation vector for any camera support device to which the device interface 151 is connected (for example a pan and tilt head, pedestal, crane or other).
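A toy sketch of that conversion, with wholly assumed encoder resolutions, turning raw per-axis transducer counts into a six degree of freedom transformation vector in the (x, y, z, pitch, roll, yaw) layout used elsewhere in this description:

```python
# Assumed calibration constants; a real device would load these during
# the calibration and set-up functions mentioned below.
COUNTS_PER_DEGREE = 100.0   # rotary encoder resolution (assumed)
COUNTS_PER_METRE = 5000.0   # linear column encoder resolution (assumed)

def counts_to_vector(pan_counts, tilt_counts, elevation_counts):
    """Scale raw encoder counts into a six degree of freedom vector:
    pan -> yaw, tilt -> pitch, elevation column -> z; unused axes zero."""
    return (0.0,
            0.0,
            elevation_counts / COUNTS_PER_METRE,
            tilt_counts / COUNTS_PER_DEGREE,
            0.0,
            pan_counts / COUNTS_PER_DEGREE)

vec = counts_to_vector(pan_counts=4500, tilt_counts=-200,
                       elevation_counts=2500)
```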
  • the FPGA 1020 or host 1000 is also programmed such that it can perform any specific calibration and set-up functions.
  • the system comprises: a studio system
  • the first camera system interface 110a has a first system frame of reference 120a and first camera trajectory 130a
  • the second camera system interface 110b has a second system frame of reference 120b and a second camera trajectory 130b
  • the third camera system interface 110c has a third system frame of reference 120c and a third camera trajectory 130c.
  • the first camera system has four minimum autonomous devices, each with a specific device interface: first minimum autonomous device 611 has device interface 511;
  • second device 612 has second device interface 512; third device 613 has third device interface 513; and fourth device 614 has fourth device interface 514.
  • the second camera system has two minimum autonomous devices: first device 621 has first device interface 521; and second device 622 has second device interface 522.
  • the third camera system has three devices: first device 631 has first device interface 531; second device 632 has second device interface 532 and third device 633 has third device interface 533.
  • although the term studio system interface is used, when this interface is acting to control one or more associated camera systems, it may equivalently be referred to as a studio system controller.
  • the architecture comprises control room 200 and studio 220, which are interfaced by studio interface 210.
  • the control room comprises server interface 202, user interface 204 and touch screen 206.
  • the studio 220 comprises a first camera system 230 and a second camera system 240.
  • the first camera system 230 comprises: a camera system interface 232; a first controllable device 234; a second controllable device 235; a third controllable device 236; and a camera 238.
  • the second camera system 240 comprises: a camera system interface 242; a first legacy interface 243a; a second legacy interface 243b; a first controllable device 244; a second controllable device 264; and a camera 248.
  • the user interface 204 and touch screen 206 are both coupled to server interface 202.
  • Server interface 202 is then linked via a local area network to the studio controller 210.
  • the studio controller 210 receives commands from the server interface 202 and
  • the studio interface 210 is coupled to the first camera system interface 232 and to the second camera system interface 242.
  • the first camera system interface 232 and second camera system interface 242 then convert commands received from the studio interface 210 into commands for specific device interfaces.
  • Each of the controllable devices illustrated (including camera 238), except devices 244 and 248, has an internal device interface (not shown).
  • Devices 244 and 248 are legacy devices, so external device interfaces 243a and 243b respectively are used.
  • In Figure 9 there is shown a schematic of a virtual reality studio, based on the architecture of Figures 2 to 7.
  • the architecture comprises: virtual reality server 305; studio interface 210; first camera system 230; and second camera system 240.
  • the equipment shown in each of the two camera systems is the same as in Figure 8, so identical reference numerals have been used. In this arrangement, however, the first controllable device 234, second controllable device 235 and third controllable device 236 are all able to provide positional feedback.
  • Each of the devices can report a specific position (or movement) to its respective device interface, which converts this to a message according to the standard device protocol 140 and communicates it to the respective camera system interface 232 or 242.
  • Where multiple devices report a position (or movement), the respective camera system interface combines this information into a combined message, which is then transmitted to the studio controller 210.
  • the studio interface 210 then combines the messages from the multiple camera systems into a message for transmission to the VR server 305.
  • the system comprises: first control room 300; second control room 310; third control room 320; first studio 220; second studio 250; and outside studio 260.
  • the first control room 300 comprises: a first server interface 202a; a first user interface 204a; a first touch screen 206a; and a network switch 290.
  • the second control room comprises a second server interface 202b; a second user interface 204b; and a second touch screen 206b.
  • the third control room comprises a third user interface 204c, which is coupled to a studio controller 270a and a network switch 290.
  • the first studio 220 is the same as illustrated in Figures 8 and 9 (comprising two camera systems 230 and 240) , such that identical reference numerals have been used.
  • a studio interface 210 is coupled to the two camera systems of the first studio 220.
  • the second studio 250 comprises a first controlled device 256 and a camera 258.
  • the outside studio 260 has a first studio controller 270c, a first controlled device 266a, a second controlled device 266b, a first camera 268a and a second camera 268b.
  • the studio controller 270c for the third studio 260 is coupled to a wireless interface comprising wireless transceivers 275a and 275b.
  • the network switch 290 of the first control room 300 is also coupled to the VR computer 305.
  • the network switch 290 of the first control room 300, the server interface 202b of the second control room 310, the network switch 290 of the third control room 320, the studio interface 210 and the first transceiver 275a are all coupled to a local area network for interfacing between all of these devices.
  • Although specific embodiments of the invention have been described herein, the skilled person may contemplate various modifications and substitutions.
  • For example, only one camera system having a plurality of minimum autonomous devices might be used.
  • this camera system may not comprise any moveable parts, for example if all of the minimum autonomous devices of the camera system were cameras using electronic zoom technology.
  • ancillary data may also be communicated, such as: a gain parameter; and a white balance parameter.
  • the present invention may be used for the control of or interface with other devices.
  • Such devices may use an associated specific device interface.
  • the device may be coupled to a camera system interface or a studio system interface.
  • Such devices may include: a prompter; a laser tracking system; and a Camera Control Unit (CCU) .
  • These devices may have their own specific interface, similar to either a device interface or a camera system interface. They might be adapted to receive commands using the common communication protocol directly from an appropriate interface.
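The description above refers throughout to a common communication protocol, independent of each device-specific protocol, whose messages carry combinations of spatial parameters (one position or movement per degree of freedom). The patent does not define a wire format, so the following is only an illustrative sketch: the field names (`axis`, `value`, `is_movement`), the device identifier, and the JSON encoding are all assumptions, not the actual protocol.

```python
# Hypothetical sketch of a "common communication protocol" message layer.
# Every field name and the JSON encoding are illustrative assumptions.
import json
from dataclasses import dataclass, asdict


@dataclass
class SpatialParameter:
    """One spatial position or movement within a single degree of freedom."""
    axis: str                # e.g. "pan", "tilt", "zoom", "height"
    value: float             # position (e.g. degrees) or movement (e.g. degrees/s)
    is_movement: bool = False


def encode_message(device_id, params):
    """Serialise a combination of spatial parameters into one network message."""
    return json.dumps({"device": device_id,
                       "params": [asdict(p) for p in params]})


def decode_message(raw):
    """Recover the device id and spatial parameters from a network message."""
    data = json.loads(raw)
    return data["device"], [SpatialParameter(**p) for p in data["params"]]


# A pan-and-tilt head reporting its pose through its device interface:
msg = encode_message("head-1", [SpatialParameter("pan", 30.0),
                                SpatialParameter("tilt", -5.0)])
device, params = decode_message(msg)
```

Because the message format is device-independent, a legacy interface (such as 243a or 243b above) would only need to translate between this representation and its device's native protocol.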

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Studio Devices (AREA)

Abstract

The present invention relates to the control or monitoring of a camera system comprising a plurality of electrical or controllable devices, each of which has a respective device-specific protocol. According to the invention, a network signal is transmitted between a system controller or interface and further system controllers or interfaces via a shared communication channel. A common communication protocol, independent of the device-specific protocol, is used, and the network signal identifies a combination of spatial parameters, each representing a spatial position within a respective degree of freedom or a spatial movement within a respective degree of freedom. The network signal may be used to control the spatial configuration of a controllable device for the capture of images by a camera system, or to determine spatial information on the basis of the plurality of spatial parameters. The network signal may also be used to control a configuration of the electrical devices. The invention further relates to combined control and monitoring.
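The description also sets out a two-level combining step: a camera system interface merges per-device spatial reports into one combined message, and a studio interface merges the messages from multiple camera systems for onward transmission (for example to the VR server 305). A minimal sketch of that combining step, assuming simple dictionary-based messages (the actual message structure is not specified in the publication), might be:

```python
# Hypothetical sketch of the two-level combining described above.
# The dictionary shapes are illustrative assumptions only.

def combine_device_reports(system_id, reports):
    """Merge per-device {axis: value} reports into one camera-system message."""
    combined = {}
    for device_id, axes in reports.items():
        for axis, value in axes.items():
            # Qualify each axis with its device so degrees of freedom stay distinct.
            combined[f"{device_id}.{axis}"] = value
    return {"system": system_id, "spatial": combined}


def combine_system_messages(messages):
    """Merge messages from multiple camera systems into one studio-level message."""
    return {"systems": {m["system"]: m["spatial"] for m in messages}}


# A pedestal and a pan/tilt head reporting through one camera system interface:
cam1 = combine_device_reports("cam-1", {"pedestal": {"height": 1.4},
                                        "head": {"pan": 30.0, "tilt": -5.0}})
studio = combine_system_messages([cam1])
```

The studio-level message then contains every reported degree of freedom for every camera system, which is the form of spatial information a consumer such as a virtual reality server would need.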
PCT/GB2010/001697 2009-09-11 2010-09-07 Interface et contrôle d'un système de caméras WO2011030097A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
GB0916014.4 2009-09-11
GB0916014A GB2473479A (en) 2009-09-11 2009-09-11 Camera system control and interface

Publications (1)

Publication Number Publication Date
WO2011030097A1 true WO2011030097A1 (fr) 2011-03-17

Family

ID=41277610

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/GB2010/001697 WO2011030097A1 (fr) 2009-09-11 2010-09-07 Interface et contrôle d'un système de caméras

Country Status (2)

Country Link
GB (1) GB2473479A (fr)
WO (1) WO2011030097A1 (fr)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
GB2499261B (en) * 2012-02-10 2016-05-04 British Broadcasting Corp Method and apparatus for converting audio, video and control signals

Citations (7)

Publication number Priority date Publication date Assignee Title
WO1993006690A1 * 1991-09-17 1993-04-01 Radamec Epo Limited Setting system for remotely controlled cameras
WO1995011566A1 * 1993-10-20 1995-04-27 Videoconferencing Systems, Inc. Adaptable videoconferencing system
EP0715453A2 * 1994-11-28 1996-06-05 Canon Kabushiki Kaisha Camera control device
WO2001013637A1 * 1999-08-12 2001-02-22 Honeywell Limited System and method for digital video management
US20010019355A1 * 1997-04-21 2001-09-06 Masakazu Koyanagi Controller for photographing apparatus and photographing system
EP1353504A1 * 2002-04-12 2003-10-15 Movetech, S.L. Remote control system for television cameras
US20080316368A1 * 2005-12-09 2008-12-25 Kuka Roboter Gmbh Method and Device for Moving a Camera Disposed on a Pan/Tilt Head Along a Given Trajectory

Family Cites Families (1)

Publication number Priority date Publication date Assignee Title
US20040070514A1 (en) * 2002-04-17 2004-04-15 Collier Michael E. Universal protocol converter

Also Published As

Publication number Publication date
GB2473479A (en) 2011-03-16
GB0916014D0 (en) 2009-10-28

Similar Documents

Publication Publication Date Title
US11423792B2 (en) System and method for obstacle avoidance in aerial systems
US10264189B2 (en) Image capturing system and method of unmanned aerial vehicle
US6922206B2 (en) Videoconferencing system with horizontal and vertical microphone arrays
WO2009142332A1 (fr) Système de caméras vidéo hybrides
KR20180100608A (ko) 정적 및/또는 모션 장면을 캡처하기 위해 다중-카메라 네트워크의 이용을 위한 시스템 및 방법
JP2009241247A (ja) ステレオ画像型検出移動装置
JP6122501B2 (ja) ビデオ監視方法、デバイス及びシステム
KR20150145590A (ko) 수중 작업용 원격제어로봇 시스템 및 그 제어 방법
KR20200122323A (ko) 멀티 센서를 사용하여 옴니 스테레오 비디오를 캡처하는 시스템 및 방법
CN107640317B (zh) 无人驾驶飞行器系统
CN112866627B (zh) 一种三维视频监控方法及相关设备
WO2017088122A1 (fr) Procédé, dispositif et systèmes de commande de point de suivi
JP6218471B2 (ja) 撮像装置、外部装置、撮像システム、撮像装置の制御方法、外部装置の制御方法、撮像システムの制御方法、及びプログラム
CN109302546B (zh) 摄像头组件及电子设备
WO2011030097A1 (fr) Interface et contrôle d'un système de caméras
JP2001245280A (ja) カメラ制御システム、装置、方法、及びコンピュータ読み取り可能な記憶媒体
JP2004128646A (ja) 監視システムおよび制御装置
CN109302547B (zh) 摄像头组件及电子设备
US10306362B1 (en) Microphone remote positioning, amplification, and distribution systems and methods
KR20010076786A (ko) 가상현실기술을 이용한 로봇 원격제어 시스템 및 방법
KR20170011928A (ko) 초광각 카메라를 이용한 렌즈 왜곡 영상 보정 카메라 시스템 및 그가 적용된 tvi 장치
JP2015104106A (ja) カメラ操作装置及びそれを有する撮影システム
CN113612967B (zh) 一种监控区域摄像头自组网系统
CN117156267B (zh) 基于环境自适应的云台相机工作模式切换方法及系统
KR102622623B1 (ko) 3차원 영상 정보를 제공하기 위한 이동형 영상 촬영 장치, 이에 대한 방법 및 이를 포함하는 시스템

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 10763222

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 10763222

Country of ref document: EP

Kind code of ref document: A1