GB2473479A - Camera system control and interface

Camera system control and interface

Info

Publication number
GB2473479A
Authority
GB
United Kingdom
Prior art keywords
camera system
camera
interface
controllable
parameter
Prior art date
Legal status
Withdrawn
Application number
GB0916014A
Other versions
GB0916014D0 (en)
Inventor
Richard Arthur Lindsay
Philip Christopher Dalgoutte
Current Assignee
Videndum PLC
Original Assignee
Vitec Group PLC
Priority date
Filing date
Publication date
Application filed by Vitec Group PLC filed Critical Vitec Group PLC
Priority to GB0916014A priority Critical patent/GB2473479A/en
Publication of GB0916014D0 publication Critical patent/GB0916014D0/en
Priority to PCT/GB2010/001697 priority patent/WO2011030097A1/en
Publication of GB2473479A publication Critical patent/GB2473479A/en
Withdrawn legal-status Critical Current

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60Control of cameras or camera modules
    • H04N23/66Remote control of cameras or camera parts, e.g. by remote control devices
    • H04N23/661Transmitting camera control signals through networks, e.g. control via the Internet
    • H04N5/232

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Studio Devices (AREA)

Abstract

Control or monitoring of a camera system comprising a plurality of electrical or controllable devices, each of which has a respective device-specific protocol, is described. A network signal is communicated between a system controller or interface 110 and device controllers or interfaces 151 across a shared communication channel. A common communication protocol 140, independent from the device-specific protocol, is used and the network signal identifies a combination of spatial parameters 181a, 182a, 183, 191a, 191b, 191c, 191d, 192a, 192b, each of which represents a spatial position or movement within a respective degree of freedom. The network signal can be used to control the spatial configuration of a controllable device 141, 142, 143 for capture of camera images by the camera system, or to determine spatial information based upon the plurality of spatial parameters 181a, 182a, 183, 191a, 191b, 191c, 191d, 192a, 192b and a configuration for the electrical devices. A device may be, for example, a pedestal, a pan and tilt head or a camera lens. The system may configure the camera position and orientation. The system may provide six degree of freedom coordinates to the devices. The device interfaces may be, for example, ETHERNET, USB, CAN or FPGA.

Description

CAMERA SYSTEM CONTROL AND INTERFACE
Technical Field of the Invention
The present invention relates to a method of controlling a camera system, a method for monitoring parameter information for a camera system and a method for interfacing a camera system with a camera system controller and camera system interface. It further relates to a system for controlling the capture of camera images using a camera system and a system for monitoring parameter information in a camera system. A device interface for controlling an associated controllable device that forms part of a camera system, a device interface for communicating a position or movement of an associated electrical device that forms part of a camera system and an integrated device interface are also considered.
Background to the Invention
Interest in electrically controlling and monitoring camera studios is increasing. A typical studio of this type comprises one or more camera systems, which are each made up of multiple electrical devices, such as a pedestal, pan and tilt head and a camera. Each device may be capable of movement across a limited range within a number of degrees of freedom. The position or movement of each device is controlled by providing electrical signals. Alternatively or additionally, a device may have other controllable parameters, such as the focus or zoom of the camera.
Changing the parameters for any device may affect the camera image obtained. In robotic camera studios, other control options are used.
Each device is typically controlled using a specific, respective protocol. Control commands are issued from a central controller. The controller may be designed to control a single camera system or a whole studio.
One existing controller is provided by Vinten (TM) Radamec Broadcast Robotics under the name Fusion, and an exemplary system configuration is shown in Fig. 1. The system comprises: server interface 10; server 20; screen 22; keyboard 24; mouse 26; input interface 28; video switcher 30; Ethernet hub 40; camera system interfaces 50; and camera systems 60. Each camera system 60 comprises: pedestal 62; pan and tilt head 65; and lens controller 68, each of which have multiple components.
Commands for controlling each camera system 60 are generated at the server 20 based on user input. These commands include instructions for multiple components of multiple parts of a camera system 60, for example generating separate instructions for controlling each of the different axes of movement for the pan and tilt head 65. The commands are communicated through server interface 10 and Ethernet hub 40 to a camera system interface 50. The camera system interface 50 then forwards these commands to the respective device in the camera system, each component of the respective device being driven separately. One camera system interface 50 also sends commands to the components of two different camera systems.
As the number of components within a camera system increases, this configuration becomes more complex and difficult to set up. Also, as the number of camera systems within a studio increases, the growth in complexity makes this arrangement difficult to scale efficiently.
Moreover, many devices are also equipped to provide monitoring information, which may include information about their current position or movement. Cameras may also be able to provide information about the object upon which they are focussed. It is desirable to communicate this information back to the central controller for positioning feedback and for use in virtual reality studios.
Summary of the Invention
Against this background, the present invention provides a method of controlling a camera system, monitoring a camera system or both.
In a first aspect, there is provided a method of controlling a camera system for capturing camera images. The camera system comprises a plurality of controllable devices, each of which has a respective device control protocol. The method comprises: determining, at a camera system controller, an instruction to be sent to control the configuration of one or more of the plurality of controllable devices; communicating a network signal from the camera system controller across a shared communication channel to a plurality of device interfaces using a common communication protocol that is independent from the device control protocol of each of the plurality of controllable devices, the network signal comprising the instruction for controlling the configuration of the plurality of controllable devices; identifying, at a first device interface from the plurality of device interfaces, that the received network signal relates to a controllable device associated with the first device interface, based upon the received network signal; and generating a control signal at the first device interface, based upon the received instruction, so as to control the configuration of the associated controllable device for capture of camera images by the camera system.
In a camera mounting system having multiple connected devices, control of the devices to yield a defined camera trajectory can be determined and effected for a specific combination and configuration of devices. The camera system provides a communications and processing architecture having the flexibility and scalability to control camera trajectory in mountings comprising variable numbers, types and configurations of devices. By employing a standardised device-independent spatial control and monitoring protocol to instruct or monitor the individual devices, such variability can be accommodated with minimal impact at a system level.
In particular, by using a common communication protocol over a shared communication channel, a simple command can be used to control any device within a camera system. This provides a control architecture that is readily scalable, because adding an additional device to the system requires only an additional device interface in terms of hardware.
Thus, scalability and flexibility are improved. Moreover, where the device interface is integrated with the controllable device, the device can be added to the camera system even more easily.
Advantageously, the camera system includes a camera and the method further comprises: capturing the camera images using the camera; and controlling the configuration of the associated controllable device using the generated control signal, so as to cause a change in the captured camera images.
Optionally, the method further comprises: acquiring a topology for the plurality of controllable devices at the camera system controller. For example, the camera system controller may receive information about each of the controllable devices and their physical relationship with respect to one another. This information may be programmed or it may be received electrically or electronically from one or more controllable devices. In such embodiments, the step of determining an instruction to control the configuration of one or more of the plurality of controllable devices may be based on the acquired topology for the plurality of controllable devices. In other words, the instructions sent to each device are made with an awareness of the relative spatial constraints on each device.
In the preferred embodiment, the network signal further comprises a device identifier that identifies one of the plurality of controllable devices. Then, the step of identifying is based on the device identifier in the received network signal.
Alternatively, the method further comprises: identifying, at a second device interface from the plurality of device interfaces, that the received network signal relates to a controllable device associated with the second device interface, based upon the received network signal; and generating a control signal at the second device interface, based upon the received instruction, so as to control the configuration of the associated controllable device for capture of camera images by the camera system.
Advantageously, each device may determine its own instructions based on a general, standardised instruction provided by the camera system controller to multiple devices.
Optionally, the shared communication channel may be a packet-based routed network (such as Ethernet). For example, an Internet Protocol-based network may be used.
The control signal is preferably a first control signal. Then, the method further comprises generating a second control signal at the first device interface, based upon the received instruction, so as to control the configuration of the associated controllable device for capture of camera images by the camera system. In this way, the device interface is able to control separately individual components of the device, for example motors dictating separate axes of movement.
Preferably, the common communication protocol comprises transmission of a spatial identifier as part of the network signal, the spatial identifier being representative of a spatial configuration for at least one controllable device.
The spatial identifier may be a vector or spatial matrix representation. Preferably, the spatial identifier comprises six spatial parameters, each spatial parameter identifying a spatial position or movement within a respective degree of freedom.
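Purely by way of illustration, the sketch below models a network signal carrying a device identifier together with such a six-parameter spatial identifier. The Python types, field names and JSON encoding are assumptions made for the example only; the common communication protocol is not limited to any particular representation.

```python
# Illustrative sketch only: one possible encoding of a network signal with a
# device identifier and a six-parameter spatial identifier (x, y, z, pitch,
# roll, yaw).  All field names and the JSON wire format are assumptions.
import json
from dataclasses import dataclass, asdict

@dataclass
class SpatialIdentifier:
    x: float      # translation, metres
    y: float
    z: float
    pitch: float  # rotation, degrees
    roll: float
    yaw: float

@dataclass
class NetworkMessage:
    device_id: int              # identifies one controllable device
    spatial: SpatialIdentifier  # demanded position or movement in six DOF

def encode(message: NetworkMessage) -> bytes:
    """Serialise a message for transmission over the shared channel."""
    return json.dumps(asdict(message)).encode("utf-8")

def decode(payload: bytes) -> NetworkMessage:
    """Rebuild the message at the receiving device interface."""
    data = json.loads(payload.decode("utf-8"))
    return NetworkMessage(device_id=data["device_id"],
                          spatial=SpatialIdentifier(**data["spatial"]))

msg = NetworkMessage(device_id=2,
                     spatial=SpatialIdentifier(0.0, 0.0, 1.2, 5.0, 0.0, -30.0))
assert decode(encode(msg)) == msg   # round-trips over the shared channel
```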
In some embodiments, the plurality of controllable devices comprises a camera. In other embodiments, the camera is not controlled by the camera system controller. Where the camera is controlled by the camera system controller, the common control protocol may comprise transmission of a camera parameter, the camera parameter being representative of a configuration for the camera. Optionally, the camera parameter is representative of a zoom configuration of the camera. Additionally or alternatively, the camera parameter is representative of a focus configuration of the camera.
In some embodiments, the method further comprises transmitting configuration information from the first device interface to the camera system controller across the shared communication channel, the configuration information indicating at least one limitation in the configuration of the controllable device associated with the first device interface. In this way, each device can inform the camera system controller about its relevant constraints, in particular physical limitations. These allow it to determine control instructions more effectively.
At least one controllable device from the plurality of controllable devices may be one of: a pan and tilt head; a camera pedestal; or a crane.
In another aspect, the present invention resides in a method of controlling a camera studio comprising a first camera system for capturing camera images and a second camera system for capturing camera images. The method comprises: controlling the first camera system according to the method as previously described using a first camera system controller; and controlling the second camera system according to the method as previously described using a second camera system controller. In some embodiments, the shared communication channel for the first camera system and the shared communication channel for the second camera system are the same. Optionally, the first camera system controller and the second camera system controller are the same. Preferably, the method further comprises communicating an instruction for control of the first camera system from a studio system controller to the first camera system controller using the common communication protocol.
In a further aspect, the present invention provides a method for monitoring parameter information for a camera system. The camera system comprises a plurality of electrical devices each of which has at least one transducer and a respective device communications protocol. The method comprises: detecting a parameter using a first transducer of a first electrical device from the plurality of electrical devices; communicating the detected parameter from the first transducer to a device interface associated with the first electrical device using the respective device communications protocol; generating parameter data based upon the detected parameter at the device interface; communicating a network signal from the device interface across a shared communication channel to a camera system interface using a common communication protocol that is independent from the device communication protocol of each of the plurality of electrical devices, the network signal comprising the parameter data; and determining parameter information at the camera system interface based upon the received parameter data and a configuration for the plurality of electrical devices.
In the same way as control of the camera system can be made more scalable and flexible by using a common communication protocol over a shared communication channel, the monitoring or feedback functions within a camera system, which direct information in the reverse direction to the control functions, can also be made more scalable and flexible. The device interface is thereby able to combine parameters received from multiple transducers (such as sensors) into common parameter data for the device.
Similarly, the camera system interface is able to combine parameter data from individual devices into parameter information for the camera system.
Optionally, the network signal further comprises a device identifier that identifies the first electrical device. Preferably, the shared communication channel may be a packet-based routed network (such as Ethernet). For example, an Internet Protocol-based network may be used.
In the preferred embodiment, the first transducer detects a first parameter and the method further comprises: detecting a second parameter using a second transducer of the first electrical device; and communicating the detected second parameter from the second transducer to the device interface using the respective device communications protocol. Then, the step of generating the parameter data comprises combining the first detected parameter and the second detected parameter. This combination is particularly advantageous when the detected parameter comprises a position or movement, as it may allow the device interface to combine data from different axes in a vector formulation.
Advantageously, the detected parameter comprises a position or movement, and wherein the parameter information comprises a position or movement. In some such embodiments, the common communication protocol comprises transmission of a spatial identifier as part of the network signal, the spatial identifier being representative of a position or movement of an electrical device. Preferably, the spatial identifier comprises six spatial parameters, each spatial parameter identifying a spatial position or movement within a respective degree of freedom. Advantageously, the position or movement relates to the camera system. Alternatively, the position or movement may relate to an object upon which a camera of the camera system is focussed.
An electrical device may have one transducer or multiple transducers. For example, a first transducer may detect a first parameter and a second transducer may detect a second parameter.
In a yet further aspect, the present invention may reside in a method for monitoring parameter information for a first camera system and a second camera system. The method comprises: monitoring parameter information for the first camera system according to the further aspect of the present invention; and monitoring parameter information for the second camera system according to the further aspect of the present invention. In some embodiments, the shared communication channel for the first camera system and the shared communication channel for the second camera system are the same. Optionally, the camera system interface for the first camera system and the camera system interface for the second camera system are the same. Preferably, the method further comprises communicating a position or movement from the camera system interface of the first camera system to a studio system controller using the common communication protocol.
In an integrated aspect of the present invention, a method for interfacing a camera system with a camera system controller and camera system interface is provided. The method comprises: the method for monitoring parameter information for the camera system according to the further aspect of the present invention; and the method for controlling the camera system according to the first aspect of the present invention.
Optionally, when the parameter information for the camera system comprises a position or movement, the method further comprises: receiving a command at the camera system controller indicating a trajectory for the camera of the camera system. The step of determining an instruction to control the configuration of one or more of the plurality of controllable devices is based on the determined position or movement and on the received command.
In a different aspect of the present invention, there is provided a system for controlling the capture of camera images using a camera system, comprising: a camera system, comprising a plurality of controllable devices each of which has a configuration which can be controlled using a respective device control protocol; a plurality of device interfaces, each device interface being associated with a controllable device; and a camera system controller, arranged to determine an instruction to be sent to control the configuration of one or more of the plurality of controllable devices and to communicate a network signal across a shared communication channel to the plurality of device interfaces using a common communication protocol that is independent from the device control protocol of each of the plurality of controllable devices, the network signal comprising the instruction for controlling the configuration of the plurality of controllable devices. Each of the plurality of device interfaces is configured to identify that the received network signal relates to a controllable device associated with the respective device interface, based upon the received network signal, and to generate a control signal, based upon the received instruction, so as to control the configuration of the associated controllable device for capture of camera images by the camera system.
Optionally, the plurality of controllable devices includes a camera arranged to capture camera images.
Another different aspect of the present invention provides a system for monitoring parameter information for a camera system, comprising: a camera system interface; a plurality of device interfaces, each of which is arranged to communicate a network signal across a shared communication channel to the camera system interface using a common communication protocol; and a camera system, comprising a plurality of electrical devices each of which has a respective device communications protocol and at least one transducer that is arranged to detect a parameter and to communicate parameter data based upon the detected parameter to a device interface from the plurality of device interfaces using the respective device communications protocol. The common communication protocol is independent from the device communication protocol of each of the plurality of electrical devices, the network signal comprises the detected parameter, and the camera system interface is further arranged to determine parameter information based upon the received detected parameter and a configuration for the plurality of electrical devices. The detected parameter is optionally a position or movement for the camera system.
An alternative aspect of the present invention may be found in a device interface for controlling an associated controllable device that forms part of a camera system, the device interface comprising: a receiver, arranged to receive a network signal from a shared communication channel, the network signal comprising a spatial identifier representative of a plurality of spatial parameters, each spatial parameter identifying a spatial position or movement within a degree of freedom; a detector, arranged to identify that the received network signal relates to the associated controllable device; and an output, arranged to generate a control signal for controlling the associated controllable device on the basis of the received spatial identifier.
A further alternative aspect of the present invention may reside in a device interface for communicating a position or movement of an associated electrical device that forms part of a camera system, the device interface comprising: an input, arranged to receive an indicator signal, the indicator signal being indicative of the position or movement of the associated mechanical device; a processor, arranged to determine a spatial identifier on the basis of the indicator signal, the spatial identifier being representative of a plurality of spatial parameters, each spatial parameter identifying a spatial position or movement within a degree of freedom; and a transmitter, arranged to transmit a network signal comprising the spatial identifier over a shared communication channel.
An integrated device interface may be provided, comprising: the device interface according to the alternative aspect of the present invention; and the device interface according to the further alternative aspect of the present invention.
In a yet further alternative aspect of the present invention, a method of communicating between a camera system for capturing camera images and a camera system interface is provided, the camera system comprising a plurality of electrical devices each of which has a respective device communication protocol. The method comprises: communicating a network signal between the camera system controller and a device interface across a shared communication channel, using a common communication protocol that is independent from the device communication protocol of each of the plurality of electrical devices; and identifying an electrical device associated with the device interface based on the received network signal.
Brief Description of the Drawings
The invention may be put into practice in various ways, a number of which will now be described by way of example only and with reference to the accompanying drawings, in which:
Figure 1 shows an existing arrangement for controlling a plurality of camera systems;
Figure 2 shows an abstraction of an interface architecture for controlling a camera system according to the present invention;
Figure 3 illustrates an exemplary architecture for controlling a robotic camera system based on Figure 2;
Figure 4 depicts an exemplary architecture for interfacing a camera system with a virtual reality or camera tracking system based on Figure 2;
Figure 5 shows exemplary hardware for the camera system interface of Figures 2 to 4;
Figure 6 illustrates exemplary hardware for one of the device interfaces shown in Figures 2 to 4;
Figure 7 depicts an abstraction of an interface architecture for a studio, comprising a plurality of camera systems according to the present invention;
Figure 8 shows a schematic of a simple arrangement for controlling a studio with two camera systems based on the architecture of Figures 2 to 7;
Figure 9 illustrates a schematic of a virtual reality studio based on the architecture of Figures 2 to 7; and
Figure 10 shows a schematic of a more complex arrangement for interfacing a plurality of studios to a control and virtual reality system, based on Figures 2 to 9.
Detailed Description of Preferred Embodiments
Referring first to Figure 2, there is shown an abstraction of a camera interface architecture according to the present invention. The architecture abstraction comprises: a camera system interface 110; a system frame of reference 120; a camera trajectory 130; a standard device protocol 140; a first minimum autonomous device 141; a second minimum autonomous device 142; a third minimum autonomous device 143; camera 195; and a camera system reference point 198.
The term minimum autonomous device is used to describe any device, comprising one or more electronic components, that is typically controlled or interfaced with by means of a device-specific electronic protocol. Examples of such devices include: a pedestal; a pan-and-tilt head; or a camera. Each of these devices may include multiple parts capable of individual interfacing or control, but they are essentially controlled together by means of a specific interface, such as serial commands. The components may be individually controlled.
The first minimum autonomous device 141 comprises a device interface 151, a device frame of reference 161 and a device trajectory 171. The first minimum autonomous device 141 has four components. Each component has a joint frame of reference and a joint movement: the first component has joint frame of reference 181a and joint movement 191a; the second component has joint frame of reference 181b and joint movement 191b; the third component has joint frame of reference 181c and joint movement 191c; and the fourth component has joint frame of reference 181d and joint movement 191d.
The second minimum autonomous device 142 has a device interface 152, a device frame of reference 162 and a device trajectory 172. The second minimum autonomous device 142 has two components. The first component has joint frame of reference 182a and joint movement 192a, and the second component has joint frame of reference 182b and joint movement 192b.
The third minimum autonomous device has device interface 153, device frame of reference 163, and device trajectory 173. This third minimum autonomous device 143 has only one component, having joint frame of reference 183 and joint movement 193.
Although the terms device interface and camera system interface are used herein, when these interfaces are acting to control an associated device they may equivalently be referred to as a device controller or camera system controller respectively.
Each camera system comprises a single camera and its associated mounting and accessories. The camera system interface 110 has recorded therein the device topological configuration and translates standard format camera trajectory commands into standard format device control commands. Moreover, it combines device spatial data into camera trajectory.
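By way of a hedged illustration only, the sketch below shows one way such a recorded device topological configuration might be represented; the structure, device names and helper function are hypothetical and are not taken from the embodiment.

```python
# Illustrative sketch only: a minimal record of a camera mounting topology,
# listing each device and the device it is mounted on.  Names are hypothetical.
CAMERA_SYSTEM_TOPOLOGY = [
    {"device_id": 1, "type": "pedestal",      "mounted_on": None},
    {"device_id": 2, "type": "pan_tilt_head", "mounted_on": 1},
    {"device_id": 3, "type": "camera_lens",   "mounted_on": 2},
]

def mounting_chain(topology, device_id):
    """Return the chain of device types from the floor up to the given device."""
    by_id = {entry["device_id"]: entry for entry in topology}
    chain = []
    current = by_id[device_id]
    while current is not None:
        chain.append(current["type"])
        parent = current["mounted_on"]
        current = by_id[parent] if parent is not None else None
    return list(reversed(chain))

print(mounting_chain(CAMERA_SYSTEM_TOPOLOGY, 3))
# ['pedestal', 'pan_tilt_head', 'camera_lens']
```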
When camera system interface 110 receives instructions to adjust the configuration of the camera system, it uses system frame of reference 120 and camera trajectory 130 to determine an instruction to be sent to control the configuration of at least one of the minimum autonomous devices. The camera system interface 110 then sends a network signal across a shared communication channel using standard device protocol 140 to each of the device interfaces 151, 152 and 153. This standard device protocol 140 is a common communication protocol that is independent from the device-specific protocol used in each of the minimum autonomous devices. The network signal includes the instruction for controlling the configuration of at least one of the minimum autonomous devices. The camera system reference point 198 defines the basis (alternatively, datum) upon which the configuration of the camera system is made.
The instruction may be intended for one minimum autonomous device, or more than one minimum autonomous device. As an example, consider the case where the instruction was intended for the first minimum autonomous device 141. Device interface 151 identifies that the received network signal relates to minimum autonomous device 141. Using device frame of reference 161 and device trajectory 171, it then generates at least one control signal, based on the received instruction, so as to control the configuration of one or more of the components of minimum autonomous device 141.
The components are controlled individually by the device interface 151, which analyses the received instruction and, if necessary, decomposes the instruction into component instructions in order to generate individual control signals that are sent to each component. This results in a change to the configuration of the first minimum autonomous device 141, and consequently a change to the configuration of the camera system as a whole. Thus, the images generated by camera 195 are changed based on this change to the configuration.
Thus, the device interface 151 translates standard format (device-independent) spatial control commands into local axis drive commands, and combinations of individual axis movements into device spatial moves. This hierarchical framework and its communication protocols also support the communication of ancillary data in standard formats between levels.
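The following sketch illustrates this device-level translation in a simplified form, assuming a dictionary-based message and hypothetical axis names; it is intended only as an illustration of the identification and decomposition steps, not as the implementation used by the device interface 151.

```python
# Illustrative sketch: a device interface checks that a received network
# message is addressed to its associated device and decomposes the six
# degree-of-freedom instruction into per-component control signals.
class DeviceInterface:
    def __init__(self, device_id: int, axes: list):
        self.device_id = device_id
        self.axes = axes                       # e.g. ["pitch", "yaw"] for a head

    def handle(self, message: dict) -> list:
        """Return one control signal per component, or [] if not addressed here."""
        if message.get("device_id") != self.device_id:
            return []                          # the message is for another device
        spatial = message["spatial"]
        return [{"axis": axis, "demand": spatial[axis]} for axis in self.axes]

head_interface = DeviceInterface(device_id=2, axes=["pitch", "yaw"])
signals = head_interface.handle(
    {"device_id": 2,
     "spatial": {"x": 0.0, "y": 0.0, "z": 0.0, "pitch": 5.0, "roll": 0.0, "yaw": -30.0}})
print(signals)   # [{'axis': 'pitch', 'demand': 5.0}, {'axis': 'yaw', 'demand': -30.0}]
```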
Turning next to Figure 3, there is illustrated an exemplary architecture for controlling a robotic camera system, based on the architecture shown in Figure 2. Where the same features as shown in Figure 2 are illustrated, identical reference numerals have been used. Device interface 151, device interface 152, and device interface 153 are coupled to camera system interface 110 through real-time network or bus 135. Camera system interface 110 receives a synchronisation or Genlock signal 111, and also has an interface to a control desk 200 via Ethernet link 112.
In this example, it may be considered that device interface 151 is configured to control a pedestal, device interface 152 is configured to control a pan and tilt head, mounted on the pedestal, and device interface 153 is configured to control a camera lens mounted on the pan and tilt head. The camera system interface 110 receives commands from control desk 200 for configuring the camera position and orientation. These may be provided in the form of a vector. The camera system interface 110 then determines instructions to be sent to each of the pedestal, pan and tilt head and lens in order to effect this command.
The camera system interface 110 uses inverse position and velocity kinematics to break down the vector received from control desk 200 into individual instruction vectors for each minimum autonomous device.
These instructions are then formulated using the standard device protocol 140, which comprises a series of standard fields used for controlling these devices. For the pedestal, the standard device protocol defines transmission of a set of vector coordinates, comprising a three-dimensional Cartesian coordinate position (xp, yp, zp) and three rotational angles defining pitch, roll and yaw (αp, βp, γp). In this way, the instruction indicates the movement of the pedestal within six degrees of freedom. Similarly, the camera system interface 110 provides a vector representation of the required movement instructions to the pan and tilt head via its device interface 152 using coordinates and rotational angles in six degrees of freedom.
The camera system interface 110 also provides instructions to device interface 153, which controls the camera lens. This instruction can comprise a zoom variable, and alternatively or additionally a focus variable.
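Bringing the Figure 3 control path together, the sketch below splits a camera-level demand into per-device vectors in a deliberately simplified manner (translation assigned to the pedestal, orientation to the pan and tilt head, optical settings to the lens). The real camera system interface 110 uses inverse position and velocity kinematics based on the stored topology, so this splitting rule and all names here are illustrative assumptions only.

```python
# Simplified sketch of the decomposition step; not the patented kinematics.
def decompose_camera_demand(demand: dict) -> dict:
    """Split a camera-level demand into per-device standard-protocol vectors."""
    return {
        "pedestal": {"x": demand["x"], "y": demand["y"], "z": demand["z"],
                     "pitch": 0.0, "roll": 0.0, "yaw": 0.0},
        "pan_tilt_head": {"x": 0.0, "y": 0.0, "z": 0.0,
                          "pitch": demand["pitch"], "roll": demand["roll"],
                          "yaw": demand["yaw"]},
        "lens": {"zoom": demand.get("zoom"), "focus": demand.get("focus")},
    }

demand = {"x": 1.5, "y": 0.0, "z": 1.2, "pitch": 5.0, "roll": 0.0, "yaw": -30.0,
          "zoom": 0.4, "focus": 3.0}
for device, vector in decompose_camera_demand(demand).items():
    print(device, vector)
```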
The device interface 152 generates separate control signals for the components of the pan and tilt head, to control each axis of movement of the pan and tilt head individually. For example, a first control signal is generated based on xp, a second control signal is generated based on yp, and so on for each component.
Looking now at Figure 4, there is shown an exemplary architecture for interfacing a camera system with a virtual reality or camera tracking system, based on that shown in Figure 2. In a similar way to the example shown in Figure 3, it may be considered that device interface 151 is configured to interface with a pedestal, device interface 152 is configured to interface with a pan and tilt head, and device interface 153 is configured to interface with a camera lens.
Device interface 151 receives a position indicator from the pedestal and translates this into a message for transmission over the real-time network or bus 135 using the standard device protocol 140. The position indicator from the pedestal may comprise multiple sensor outputs, which are individually received at the device interface 151. Each sensor output can be considered an individual parameter. The device interface 151 then combines these individual parameters into a set of parameter data, which forms the basis for the message to be transmitted.
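A minimal sketch of this combining step is given below, assuming hypothetical axis names; it shows how individually received sensor outputs might be folded into a single six-parameter set of parameter data.

```python
# Illustrative sketch: fold per-axis sensor readings into one spatial
# identifier; axes that a device does not report default to zero.
def combine_transducer_readings(readings: dict) -> dict:
    axes = ("x", "y", "z", "pitch", "roll", "yaw")
    return {axis: float(readings.get(axis, 0.0)) for axis in axes}

# e.g. a pan and tilt head that only reports its two rotation axes:
head_readings = {"pitch": 12.5, "yaw": -48.0}
print(combine_transducer_readings(head_readings))
# {'x': 0.0, 'y': 0.0, 'z': 0.0, 'pitch': 12.5, 'roll': 0.0, 'yaw': -48.0}
```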
As described with reference to Figure 3, the message then comprises a vector representation of the position (or movement) in six degrees of freedom comprising three Cartesian positional coordinates (xp, yp, zp) and three rotational angles (αp, βp, γp). Similarly, the pan and tilt head communicates positional information to its respective device interface 152. Device interface 152 then communicates this information to the camera system interface using the standard device protocol 140, such that a vector describing the position (or movement) in six degrees of freedom is transmitted. The camera lens reports zoom angle and focus parameters to its respective device interface 153, which reports this information using the vector representation of the standard device protocol 140 over real-time network or bus 135 to the camera system interface 110.
The camera system interface then uses forward velocity and position kinematics to sum the received vector information and thereby provide a positional vector and orientation for the camera. This is sent to a VR rendering computer 305 over an RS232/422 serial interface 113 or an Ethernet link 112.
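As a hedged illustration of this forward summation, the sketch below chains two homogeneous transforms (yaw-only rotations, for brevity) to recover a camera position and orientation in the studio frame; the two-device chain, the numerical values and the use of numpy are assumptions made for the example.

```python
# Illustrative sketch: chain per-device homogeneous transforms, in mounting
# order, to obtain the camera position and orientation in the studio frame.
import numpy as np

def transform(x, y, z, yaw_deg):
    """4x4 homogeneous transform: a translation plus a rotation about z."""
    c, s = np.cos(np.radians(yaw_deg)), np.sin(np.radians(yaw_deg))
    t = np.eye(4)
    t[:3, :3] = [[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]]
    t[:3, 3] = [x, y, z]
    return t

# The pedestal reports its pose in the studio frame; the head reports its
# pose relative to the pedestal mounting point.
pedestal = transform(1.5, 0.0, 1.2, 0.0)
head = transform(0.0, 0.0, 0.3, -30.0)

camera_pose = pedestal @ head           # chain the device transforms
position = camera_pose[:3, 3]           # camera position in the studio frame
yaw = np.degrees(np.arctan2(camera_pose[1, 0], camera_pose[0, 0]))
print(position, yaw)                    # position ≈ [1.5, 0.0, 1.5], yaw ≈ -30.0
```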
Referring next to Figure 5, there is depicted exemplary hardware for the camera system interface 110 of Figures 2 to 4. This comprises: a host 900; a USB interface 901; a CAN interface 902; a first serial interface 903; a second serial interface 904; a first Ethernet interface 905; a second Ethernet interface 906; a bus 910; and a processor 920.
The host 900 communicates with the device interfaces 151, 152 and 153 and the other parts of the system using the USB interface 901, CAN interface 902, serial interfaces 903 and 904, and Ethernet interfaces 905 and 906. The host 900 also communicates with the processor 920 via bus 910.
The configuration of the camera system is stored on the camera system interface 110, for example information that the pan and tilt head is mounted on a pedestal. In this way, when the camera system interface 110 receives an instruction to adjust the camera image by a specific vector, it can perform the correct inverse kinematic functions to determine individual node vector demands from this global vector. Conversely, the camera system interface 110 can perform the correct forward kinematic summations of the vectors received from each device interface to produce a global transformation vector that describes the camera position and orientation with respect to a global reference frame. Configuration of individual minimum autonomous devices can also be performed via the camera system interface 110, such that direct connection of any minimum autonomous device to a specific controller is not required.
Referring now to Figure 6, there is illustrated exemplary hardware for one of the device interfaces shown in Figures 2 to 4. The device interface 151 comprises: host 1000; USB interface 1001; CAN interface 1002; first serial interface 1003; second serial interface 1004; first Ethernet interface 1005; second Ethernet interface 1006; bus 1010; Field-Programmable Gate Array (FPGA) 1020; digital input/output 1030; signal conditioning block 1040; and input/output 1050. Host 1000 communicates with a specific device and with the camera system interface 110 using USB interface 1001, CAN interface 1002, first and second serial interfaces 1003 and 1004, and first and second Ethernet interfaces 1005 and 1006. The first Ethernet channel 1005 is used for all communications with the camera system interface 110. The second Ethernet channel 1006, together with the serial channels (RS232/422), CAN and USB interfaces, is used for communication with devices connected to the device interface 151, for example a motion controller or a legacy device, enabling it to function on the architecture of the present invention.
The device interface 151 can accept many different transducer or communications inputs. Signal conditioning is performed on these inputs using block 1040 to convert them to 3.3V logic before they are connected to the input/output (I/O) lines of the FPGA 1020. The FPGA contains the specific programming that enables it to convert the transducer inputs into a six degree of freedom transformation vector for any camera support device to which the device interface 151 is connected (for example a pan and tilt head, pedestal, crane or other). The FPGA 1020 or host 1000 is also programmed such that it can perform any specific calibration and set-up functions.
Referring now to Figure 7, there is depicted an abstraction of an interface architecture for a studio, comprising a plurality of camera systems according to the present invention. Specifically, three camera systems are illustrated. The system comprises: a studio system interface 410; a studio frame of reference 420; multiple camera trajectories 430; a standard system protocol 440; a first camera system interface 110a; a second camera system interface 110b; and a third camera system interface 110c.
The first camera system interface 110a has a first system frame of reference 120a and first camera trajectory 130a, the second camera system interface 110b has a second system frame of reference 120b and a second camera trajectory 130b, and the third camera system interface 110c has a third system frame of reference 120c and a third camera trajectory 130c.
The first camera system has four minimum autonomous devices, each with a specific device interface: first minimum autonomous device 611 has device interface 511; second device 612 has second device interface 512; third device 613 has third device interface 513; and fourth device 614 has fourth device interface 514. The second camera system has two minimum autonomous devices: first device 621 has first device interface 521; and second device 622 has second device interface 522. The third camera system has three devices: first device 631 has first device interface 531; second device 632 has second device interface 532 and third device 633 has third device interface 533.
Although the term studio system interface is used herein, when this interface is acting to control one or more associated camera systems, it may equivalently be referred to as a studio system controller.
Turning next to Figure 8, there is shown a schematic of a simple arrangement for controlling a studio with two camera systems, based on the architecture of Figures 2 to 7.
The architecture comprises control room 200 and studio 220, which are interfaced by studio interface 210. The control room comprises server interface 202, user interface 204 and touch screen 206. The studio 220 comprises a first camera system 230 and a second camera system 240. The first camera system 230 comprises: a camera system interface 232; a first controllable device 234; a second controllable device 235; a third controllable device 236; and a camera 238. The second camera system 240 comprises: a camera system interface 242; a first legacy interface 243a; a second legacy interface 243b; a first controllable device 244; a second controllable device 264; and a camera 248.
In the control room 200, the user interface 204 and touch screen 206 are both coupled to server interface 202.
Server interface 202 is then linked via a local area network to the studio controller 210. The studio controller 210 receives commands from the server interface 202 and translates these into commands for specific camera system interfaces. The studio interface 210 is coupled to the first camera system interface 232 and to the second camera system interface 242. The first camera system interface 232 and second camera system interface 242 then convert commands received from the studio interface 210 into commands for specific device interfaces. Each of the controllable devices illustrated (including camera 238), except devices 244 and 248, has an internal device interface (not shown). Devices 244 and 248 are legacy devices and an external device interface 243a and 243b respectively is used.
Referring next to Figure 9, there is shown a schematic of a virtual reality studio, based on the architecture of Figures 2 to 7. The architecture comprises: virtual reality server 305; studio interface 210; first camera system 230; and second camera system 240. The equipment shown in each of the two camera systems is the same as in Figure 8, so identical reference numerals have been used.
In this arrangement, the first controllable device 234, second controllable device 235 and third controllable device 236 are all able to provide positional feedback.
Each of the devices can report a specific position (or movement) to their respective device interface, which converts this to a message according to the standard device protocol 140 and communicates it to its respective camera system interface 232 or camera system interface 242. Where multiple devices report a position (or movement) the respective camera system interface combines this information to provide a combined message, which is then transmitted to the studio controller 210. The studio interface 210 then combines the messages from the multiple camera systems into a message for transmission to the VR server 305.
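The sketch below illustrates this hierarchical reporting path in schematic form, using hypothetical message shapes: device reports are bundled per camera system, and the per-camera reports are bundled by the studio interface for the VR server.

```python
# Illustrative sketch of hierarchical aggregation of position reports.
def combine_device_reports(camera_id: str, device_reports: list) -> dict:
    """One message per camera system, carrying all of its device positions."""
    return {"camera": camera_id, "devices": device_reports}

def combine_camera_reports(studio_id: str, camera_reports: list) -> dict:
    """One studio-level message for transmission to the VR server."""
    return {"studio": studio_id, "cameras": camera_reports}

report_1 = combine_device_reports("camera_230", [{"device_id": 1, "z": 1.2}])
report_2 = combine_device_reports("camera_240", [{"device_id": 4, "yaw": -30.0}])
print(combine_camera_reports("studio_220", [report_1, report_2]))
```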
Looking now at Figure 10, there is shown a schematic of a more complex arrangement for interfacing a plurality of studios to a control and virtual reality system based on Figures 2 to 9. The system comprises: first control room 300; second control room 310; third control room 320; first studio 220; second studio 250; and outside studio 260. The first control room 300 comprises: a first server interface 202a; a first user interface 204a; a first touch screen 206a; and a network switch 290. The second control room comprises a second server interface 202b; a second user interface 204b; and a second touch screen 206b. The third control room comprises a third user interface 204c, a studio controller 270a and a network switch 290.
The first studio 220 is the same as illustrated in Figures 8 and 9 (comprising two camera systems 230 and 240), such that identical reference numerals have been used. A studio interface 210 is coupled to the two camera systems of the first studio 220. The second studio 250 comprises a first controlled device 256 and a camera 258. The outside studio 260 has a first studio controller 270c, a first controlled device 266a, a second controlled device 266b, a first camera 268a and a second camera 268b. The studio controller 270c for the third studio 260 is coupled to a wireless interface comprising wireless transceivers 275a and 275b.
The network switch 290 of the first control room 300 is also coupled to the VR computer 305.
The network switch 290 of the first control room 300, the server interface 202b of the second control room 310, the network switch 290 of the third control room 320, the studio interface 210 and the first transceiver 275a are all coupled to a local area network for interfacing between all of these devices.
Whilst specific embodiments of the invention have been described herein, the skilled person may contemplate various modifications and substitutions. For example, only one camera system having a plurality of minimum autonomous devices might be used. Moreover, this camera system may not comprise any moveable parts, for example if all of the minimum autonomous devices of the camera system were cameras using electronic zoom technology.
Although the communication of positional, movement and camera parameters using the invention has been described, other parameters may alternatively be communicated. These may include ancillary data such as: a gain parameter; and a white balance parameter.
In addition to, or as an alternative to, the electronic or controllable devices described above, the present invention may be used for the control of or interface with other devices. Such devices may use an associated specific device interface. Alternatively, the device may be coupled to a camera system interface or a studio system interface. Such devices may include: a prompter; a laser tracking system; and a Camera Control Unit (CCU). These devices may have their own specific interface, similar to either a device interface or a camera system interface. They might be adapted to receive commands using the common communication protocol directly from an appropriate interface.

Claims (36)

  1. CLAIMS: 1. A method of controlling a camera system for capturing camera images, the camera system comprising a plurality of controllable devices each of which has a respective device control protocol, the method comprising: determining, at a camera system controller, an instruction to be sent to control the configuration of one or more of the plurality of controllable devices; communicating a network signal from the camera system controller across a shared communication channel to a plurality of device interfaces using a common communication protocol that is independent from the device control protocol of each of the plurality of controllable devices, the network signal comprising the instruction for controlling the configuration of the plurality of controllable devices; identifying, at a first device interface from the plurality of device interfaces, that the received network signal relates to a controllable device associated with the first device interface, based upon the received network signal; and generating a control signal at the first device interface, based upon the received instruction, so as to control the configuration of the associated controllable device for capture of camera images by the camera system.
  2. 2. The method of claim 1, wherein the camera system includes a camera, the method further comprising: capturing the camera images using the camera; and controlling the configuration of the associated controllable device using the generated control signal, so as to cause a change in the captured camera images.
  3. 3. The method of claim 1 or claim 2, further comprising: acquiring a topology for the plurality of controllable devices at the camera system controller; and wherein the step of determining an instruction to control the configuration of one or more of the plurality of controllable devices is based on the acquired topology for the plurality of controllable devices.
  4. 4. The method of any of claims 1 to 3, wherein the network signal further comprises a device identifier that identifies one of the plurality of controllable devices, and wherein the step of identifying is based on the device identifier in the received network signal.
  5. 5. The method of any of claims 1 to 3, further comprising: identifying, at a second device interface from the plurality of device interfaces, that the received network signal relates to a controllable device associated with the second device interface, based upon the received network signal; and generating a control signal at the second device interface, based upon the received instruction, so as to control the configuration of the associated controllable device for capture of camera images by the camera system.
  6. 6. The method of any preceding claim, wherein the control signal is a first control signal and further comprising: generating a second control signal at the first device interface, based upon the received instruction, so as to control the configuration of the associated controllable device for capture of camera images by the camera system.
  7. 7. The method of any preceding claim, wherein the common communication protocol comprises transmission of a spatial identifier as part of the network signal, the spatial identifier being representative of a spatial configuration for at least one controllable device.
  8. 8. The method of any preceding claim, wherein the plurality of controllable devices comprises a camera.
  9. 9. The method of claim 8, wherein the common control protocol comprises transmission of a camera parameter, the camera parameter being representative of a configuration for the camera.
  10. 10. The method of claim 9, wherein the camera parameter is representative of a zoom configuration of the camera.
  11. 11. The method of claim 9 or claim 10, wherein the camera parameter is representative of a focus configuration of the camera.
  12. 12. The method of any preceding claim, further comprising: transmitting configuration information from the first device interface to the camera system controller across the shared communication channel, the configuration information indicating at least one limitation in the configuration of the controllable device associated with the first device interface.
  13. 13. The method of any preceding claim, wherein at least one controllable device from the plurality of controllable devices is a pan and tilt head.
  14. 14. The method of any preceding claim, wherein at least one controllable device from the plurality of controllable devices is a camera pedestal.
  15. 15. A method of controlling a camera studio comprising a first camera system for capturing camera images and a second camera system for capturing camera images, the method comprising: controlling the first camera system according to the method of any preceding claim using a first camera system controller; and controlling the second camera system according to the method of any preceding claim using a second camera system controller.
  16. 16. The method of claim 15, further comprising: communicating an instruction for control of the first camera system from a studio system controller to the first camera system controller using the common communication protocol.
  17. 17. A method for monitoring parameter information for a camera system, the camera system comprising a plurality of electrical devices each of which has at least one transducer and a respective device communications protocol, the method comprising: detecting a parameter using a first transducer of a first electrical device from the plurality of electrical devices; communicating the detected parameter from the first transducer to a device interface associated with the first electrical device using the respective device communications protocol; generating parameter data based upon the detected parameter at the device interface; communicating a network signal from the device interface across a shared communication channel to a camera system interface using a common communication protocol that is independent from the device communication protocol of each of the plurality of electrical devices, the network signal comprising the parameter data; and determining parameter information at the camera system interface based upon the received parameter data and a configuration for the plurality of electrical devices.
  18. 18. The method of claim 17, wherein the network signal further comprises a device identifier that identifies the first electrical device.
  19. 19. The method of claim 17 or claim 18, wherein the first transducer detects a first parameter, the method further comprising: detecting a second parameter using a second transducer of the first electrical device; and communicating the detected second parameter from the second transducer to the device interface using the respective device communications protocol; and wherein the step of generating the parameter data comprises combining the first detected parameter and the second detected parameter.
  20. 20. The method of any of claims 17 to 19, wherein the detected parameter comprises a position or movement, and wherein the parameter information comprises a position or movement.
  21. 21. The method of claim 19, wherein the common communication protocol comprises transmission of a spatial identifier as part of the network signal, the spatial identifier being representative of a position or movement of an electrical device.
  22. 22. The method of claim 7 or claim 20, wherein the spatial identifier comprises six spatial parameters, each spatial parameter identifying a spatial position or movement within a respective degree of freedom.
  23. 23. A method for monitoring parameter information for a first camera system and a second camera system, the method comprising: monitoring parameter information for the first camera system according to the method of any of claims 17 to 22; and monitoring parameter information for the second camera system according to the method of any of claims 17 to 22.
  24. 24. A method for interfacing a camera system with a camera system controller and camera system interface, the method comprising: the method for monitoring the parameter information for the camera system of any of claims 17 to 22; and the method for controlling the camera system of any of claims 1 to 14 or claim 22.
  25. 25. The method of claim 24 when dependent upon any one of claims 20 to 22, further comprising: receiving a command at the camera system controller indicating a trajectory for the camera of the camera system; and wherein the step of determining an instruction to control the configuration of one or more of the plurality of controllable devices is based on the determined position or movement and on the received command.
26. A system for controlling the capture of camera images using a camera system, comprising:
a camera system, comprising a plurality of controllable devices each of which has a configuration which can be controlled using a respective device control protocol;
a plurality of device interfaces, each device interface being associated with a controllable device; and
a camera system controller, arranged to determine an instruction to be sent to control the configuration of one or more of the plurality of controllable devices and to communicate a network signal across a shared communication channel to the plurality of device interfaces using a common communication protocol that is independent from the device control protocol of each of the plurality of controllable devices, the network signal comprising the instruction for controlling the configuration of the plurality of controllable devices;
and wherein each of the plurality of device interfaces is configured to identify that the received network signal relates to a controllable device associated with the respective device interface, based upon the received network signal, and to generate a control signal, based upon the received instruction, so as to control the configuration of the associated controllable device for capture of camera images by the camera system.
27. The system of claim 26, wherein the plurality of controllable devices includes a camera arranged to capture camera images.
28. A system for monitoring parameter information for a camera system, comprising:
a camera system interface;
a plurality of device interfaces, each of which is arranged to communicate a network signal across a shared communication channel to the camera system interface using a common communication protocol; and
a camera system, comprising a plurality of electrical devices each of which has a respective device communications protocol and at least one transducer that is arranged to detect a parameter and to communicate parameter data based upon the detected parameter to a device interface from the plurality of device interfaces using the respective device communications protocol;
wherein the common communication protocol is independent from the device communication protocol of each of the plurality of electrical devices, and the network signal comprises the detected parameter;
and wherein the camera system interface is further arranged to determine parameter information based upon the received detected parameter and a configuration for the plurality of electrical devices.
29. A device interface for controlling an associated controllable device that forms part of a camera system, the device interface comprising:
a receiver, arranged to receive a network signal from a shared communication channel, the network signal comprising a spatial identifier representative of a plurality of spatial parameters, each spatial parameter identifying a spatial position or movement within a degree of freedom;
a detector, arranged to identify that the received network signal relates to the associated controllable device; and
an output, arranged to generate a control signal for controlling the associated controllable device on the basis of the received spatial identifier.
30. A device interface for communicating a position or movement of an associated electrical device that forms part of a camera system, the device interface comprising:
an input, arranged to receive an indicator signal, the indicator signal being indicative of the position or movement of the associated mechanical device;
a processor, arranged to determine a spatial identifier on the basis of the indicator signal, the spatial identifier being representative of a plurality of spatial parameters, each spatial parameter identifying a spatial position or movement within a degree of freedom; and
a transmitter, arranged to transmit a network signal comprising the spatial identifier over a shared communication channel.
31. An integrated device interface, comprising:
the device interface of claim 29; and
the device interface of claim 30.
32. A method of communicating between a camera system for capturing camera images and a camera system interface, the camera system comprising a plurality of electrical devices each of which has a respective device communication protocol, the method comprising:
communicating a network signal between the camera system controller and a device interface across a shared communication channel, using a common communication protocol that is independent from the device communication protocol of each of the plurality of electrical devices; and
identifying an electrical device associated with the device interface based on the received network signal.
33. A method of controlling a camera system, substantially as herein described with reference to Figures 2 to 10.
34. A method for monitoring parameter information for a camera system, substantially as herein described with reference to Figures 2 to 10.
35. A system for controlling the capture of camera images using a camera system, substantially as herein described with reference to Figures 2 to 10.
36. A system for monitoring parameter information for a camera system, substantially as herein described with reference to Figures 2 to 10.
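The claims above define the common communication protocol and the spatial identifier purely in functional terms. As an informal aid to reading claims 18, 21, 22, 29 and 30, the following minimal sketch (in Python, a language chosen only for brevity; the application does not prescribe any language or wire format) shows one possible representation of a network signal carrying a device identifier and six spatial parameters, one per degree of freedom. The field names, the split into three translational and three rotational axes, and the JSON framing are assumptions made for illustration only.

# Illustrative sketch only -- not part of the claimed subject matter.
# Field names and axis conventions are assumptions for illustration.
from dataclasses import dataclass, asdict
import json

@dataclass
class SpatialIdentifier:
    """Six spatial parameters, one per degree of freedom (claim 22)."""
    x: float      # translation along a first axis
    y: float      # translation along a second axis
    z: float      # translation along a third axis (e.g. pedestal height)
    pan: float    # rotation about the vertical axis
    tilt: float   # rotation about the transverse axis
    roll: float   # rotation about the optical axis

@dataclass
class NetworkSignal:
    """A message carried over the shared communication channel."""
    device_id: str              # device identifier (claim 18)
    spatial: SpatialIdentifier  # spatial identifier (claims 21 and 29)

    def encode(self) -> bytes:
        # JSON framing is an arbitrary choice for this sketch; the claims
        # leave the wire format of the common protocol open.
        return json.dumps(asdict(self)).encode("utf-8")

    @staticmethod
    def decode(raw: bytes) -> "NetworkSignal":
        obj = json.loads(raw.decode("utf-8"))
        return NetworkSignal(obj["device_id"], SpatialIdentifier(**obj["spatial"]))

# Example: a pan-and-tilt head position expressed in the common protocol.
signal = NetworkSignal("head-1", SpatialIdentifier(0.0, 0.0, 1.5, 30.0, -5.0, 0.0))
assert NetworkSignal.decode(signal.encode()) == signal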
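Claims 26 and 29 require each device interface to identify that a network signal received from the shared channel relates to its associated controllable device and to generate a control signal in the device-specific protocol. One way that dispatch step could look is sketched below, reusing the NetworkSignal and SpatialIdentifier classes from the previous sketch; matching on a device identifier and the hypothetical head_protocol() command string are illustrative assumptions, since the claims leave both the identification rule and the device control protocol open.

# Illustrative sketch only: one way a device interface (claim 29) might
# filter messages from the shared channel and drive a device-specific
# control protocol. The translate() callback stands in for whatever
# proprietary protocol (serial, CAN, etc.) the controllable device uses.
from typing import Callable, Optional

class DeviceInterface:
    def __init__(self, device_id: str,
                 translate: Callable[[SpatialIdentifier], bytes]):
        self.device_id = device_id  # identity of the associated device
        self.translate = translate  # common protocol -> device control protocol

    def on_network_signal(self, sig: NetworkSignal) -> Optional[bytes]:
        # Detector: does this network signal relate to our device?
        if sig.device_id != self.device_id:
            return None
        # Output: generate the device-specific control signal.
        return self.translate(sig.spatial)

# Hypothetical pan/tilt head that accepts a simple ASCII command string.
def head_protocol(sp: SpatialIdentifier) -> bytes:
    return f"PAN {sp.pan:.1f} TILT {sp.tilt:.1f}\r\n".encode("ascii")

head_interface = DeviceInterface("head-1", head_protocol)
command = head_interface.on_network_signal(signal)  # b'PAN 30.0 TILT -5.0\r\n'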
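In the monitoring direction, claims 17 to 19 and 28 have parameter data derived from one or more transducer readings and converted into parameter information at the camera system interface using a configuration for the electrical devices. The self-contained sketch below assumes a pedestal height sensor made up of a coarse and a fine encoder, with a linear scale factor held in the configuration; the encoder model, the combining rule and the numbers are invented for illustration.

# Illustrative sketch of the monitoring path of claims 17-19 and 28.
# The encoder counts, scale factor and the idea of summing a coarse and a
# fine transducer are assumptions; the claims only require that detected
# parameters are combined into parameter data and interpreted using a
# configuration for the devices.

def generate_parameter_data(coarse_counts: int, fine_counts: int) -> int:
    # Claim 19: combine the first and second detected parameters.
    return coarse_counts * 1000 + fine_counts

class CameraSystemInterface:
    def __init__(self, configuration: dict):
        # e.g. {"pedestal-1": {"metres_per_count": 0.000002}}
        self.configuration = configuration

    def determine_parameter_information(self, device_id: str,
                                        parameter_data: int) -> float:
        # Claim 17: parameter information is derived from the received
        # parameter data and the stored device configuration.
        scale = self.configuration[device_id]["metres_per_count"]
        return parameter_data * scale

camera_system_interface = CameraSystemInterface(
    {"pedestal-1": {"metres_per_count": 0.000002}})
data = generate_parameter_data(coarse_counts=750, fine_counts=250)
height_m = camera_system_interface.determine_parameter_information("pedestal-1", data)
# height_m == 1.5005, i.e. the pedestal column height in metres in this example.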
GB0916014A 2009-09-11 2009-09-11 Camera system control and interface Withdrawn GB2473479A (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
GB0916014A GB2473479A (en) 2009-09-11 2009-09-11 Camera system control and interface
PCT/GB2010/001697 WO2011030097A1 (en) 2009-09-11 2010-09-07 Camera system control and interface

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
GB0916014A GB2473479A (en) 2009-09-11 2009-09-11 Camera system control and interface

Publications (2)

Publication Number Publication Date
GB0916014D0 GB0916014D0 (en) 2009-10-28
GB2473479A true GB2473479A (en) 2011-03-16

Family

ID=41277610

Family Applications (1)

Application Number Title Priority Date Filing Date
GB0916014A Withdrawn GB2473479A (en) 2009-09-11 2009-09-11 Camera system control and interface

Country Status (2)

Country Link
GB (1) GB2473479A (en)
WO (1) WO2011030097A1 (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
GB2499261A (en) * 2012-02-10 2013-08-14 British Broadcasting Corp Converting Audio, Video and Control Signals and data streams for a network

Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20040070514A1 (en) * 2002-04-17 2004-04-15 Collier Michael E. Universal protocol converter

Family Cites Families (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO1993006690A1 (en) * 1991-09-17 1993-04-01 Radamec Epo Limited Setting-up system for remotely controlled cameras
SG67927A1 (en) * 1993-10-20 1999-10-19 Videoconferencing Sys Inc Adaptive videoconferencing system
EP0715453B1 (en) * 1994-11-28 2014-03-26 Canon Kabushiki Kaisha Camera controller
JP4332231B2 (en) * 1997-04-21 2009-09-16 ソニー株式会社 Imaging device controller and imaging system
AUPQ217399A0 (en) * 1999-08-12 1999-09-02 Honeywell Limited Realtime digital video server
EP1353504A1 (en) * 2002-04-12 2003-10-15 Movetech, S.L. Remote control system for television cameras
DE102005058867B4 (en) * 2005-12-09 2018-09-27 Cine-Tv Broadcast Systems Gmbh Method and device for moving a camera arranged on a pan and tilt head along a predetermined path of movement

Patent Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20040070514A1 (en) * 2002-04-17 2004-04-15 Collier Michael E. Universal protocol converter

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
GB2499261A (en) * 2012-02-10 2013-08-14 British Broadcasting Corp Converting Audio, Video and Control Signals and data streams for a network
GB2499261B (en) * 2012-02-10 2016-05-04 British Broadcasting Corp Method and apparatus for converting audio, video and control signals

Also Published As

Publication number Publication date
WO2011030097A1 (en) 2011-03-17
GB0916014D0 (en) 2009-10-28

Similar Documents

Publication Publication Date Title
US11233943B2 (en) Multi-gimbal assembly
US11423792B2 (en) System and method for obstacle avoidance in aerial systems
US10264189B2 (en) Image capturing system and method of unmanned aerial vehicle
US6922206B2 (en) Videoconferencing system with horizontal and vertical microphone arrays
WO2009142332A1 (en) Hybrid video camera syste
US10917560B2 (en) Control apparatus, movable apparatus, and remote-control system
JP6122501B2 (en) Video surveillance method, device and system
KR101677303B1 (en) Camera device, camera system, control device and program
KR20200122323A (en) System and method for capturing omni stereo video using multiple sensors
CN112866627B (en) Three-dimensional video monitoring method and related equipment
CN107431749B (en) Focus following device control method, device and system
JP6218471B2 (en) IMAGING DEVICE, EXTERNAL DEVICE, IMAGING SYSTEM, IMAGING DEVICE CONTROL METHOD, EXTERNAL DEVICE CONTROL METHOD, IMAGING SYSTEM CONTROL METHOD, AND PROGRAM
EP2648406B1 (en) Method for switching viewing modes in a camera
JP2000083246A (en) Camera control system, camera control method, and recording medium stored with program to execute processing thereof
GB2473479A (en) Camera system control and interface
JP2001245280A (en) Camera control system, device, method, and computer- readable storage medium
CN109302546B (en) Camera assembly and electronic equipment
JP2004128646A (en) Monitoring system and controller
US10306362B1 (en) Microphone remote positioning, amplification, and distribution systems and methods
KR20170055455A (en) Camera system for compensating distortion of lens using super wide angle camera and Transport Video Interface Apparatus used in it
Chae et al. The comparison of the detecting performance between the ground and the aerial visual analytics in the UGV-UAV collaborative system
JP2015104106A (en) Camera operation device and imaging system with the same
CN117156267B (en) Cloud deck camera working mode switching method and system based on environment self-adaption
CN113612967B (en) Monitoring area camera ad hoc network system
KR102196683B1 (en) Device and method for photographing 3d tour

Legal Events

Date Code Title Description
WAP Application withdrawn, taken to be withdrawn or refused ** after publication under section 16(1)