WO2011022755A1 - A method and apparatus for relative control of multiple cameras - Google Patents


Info

Publication number
WO2011022755A1
Authority
WO
WIPO (PCT)
Prior art keywords
image
halo
primary
camera
motion picture
Prior art date
Application number
PCT/AU2010/000886
Other languages
French (fr)
Inventor
Jeremy Anderson
Original Assignee
Trace Optics Pty Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from AU2009904169A external-priority patent/AU2009904169A0/en
Application filed by Trace Optics Pty Ltd filed Critical Trace Optics Pty Ltd
Priority to US13/392,515 priority Critical patent/US20120154593A1/en
Priority to EP10811004.0A priority patent/EP2474162B8/en
Priority to CN201080038605.XA priority patent/CN102598658B/en
Priority to AU2010286316A priority patent/AU2010286316B2/en
Priority to JP2012525813A priority patent/JP5806215B2/en
Publication of WO2011022755A1 publication Critical patent/WO2011022755A1/en

Classifications

    • H ELECTRICITY
        • H04 ELECTRIC COMMUNICATION TECHNIQUE
            • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
                • H04N 5/00 Details of television systems
                    • H04N 5/222 Studio circuitry; Studio devices; Studio equipment
                • H04N 7/00 Television systems
                    • H04N 7/18 Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
                        • H04N 7/183 Closed-circuit television [CCTV] systems for receiving images from a single remote source
                            • H04N 7/185 Closed-circuit television [CCTV] systems for receiving images from a single remote source from a mobile camera, e.g. for remote control
                • H04N 23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
                    • H04N 23/60 Control of cameras or camera modules
                        • H04N 23/61 Control of cameras or camera modules based on recognised objects
                        • H04N 23/62 Control of parameters via user interfaces
                        • H04N 23/695 Control of camera direction for changing a field of view, e.g. pan, tilt or based on tracking of objects
                    • H04N 23/90 Arrangement of cameras or camera modules, e.g. multiple cameras in TV studios or sports stadiums
    • G PHYSICS
        • G06 COMPUTING; CALCULATING OR COUNTING
            • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
                • G06V 20/00 Scenes; Scene-specific elements
                    • G06V 20/40 Scenes; Scene-specific elements in video content
                        • G06V 20/41 Higher-level, semantic clustering, classification or understanding of video scenes, e.g. detection, labelling or Markovian modelling of sport events or news items
                            • G06V 20/42 Higher-level, semantic clustering, classification or understanding of video scenes, of sport video content

Definitions

  • the present invention relates generally to the field of camera control systems and in one aspect relates to the control of at least two cameras for capturing different images of an object moving across a surface, wherein a primary image is contained within, and movable within, a halo image, the position of the halo image being dependent upon the movement of the object.
  • Televised sporting events are extremely popular on both free-to-air and pay television, with many channels being solely dedicated to sport. With the advent of more advanced camera technology, quality has increased and new camera shots have been achieved. Cameras located in cricket stumps and inside race cars are now common.
  • the present invention provides an alternative whereby the cameras can be controlled automatically using servos and encoders, enabling auto focus, auto zoom, and auto pan and tilt.
  • This system enables the camera to receive control signals from a control means to facilitate the capturing of imagery of the game.
  • the cost of placing a skilled camera operator behind each camera is one of the limitations of the manually controlled systems.
  • the placement of cameras around the perimeter of the playing field is restricted.
  • a further limitation of a manually controlled system is that camera operators can obscure the action of sport or stage productions when close-ups are needed, as is the case with boxing and ice hockey.
  • first and second motion picture cameras being controlled such that the position of the halo image relative to the primary image can be altered.
  • the first and second motion picture cameras being controlled such that the primary image retains a portion of the halo image and the position of the halo image relative to the primary image can be altered.
  • the object may be a ball being used in a sporting contest, wherein the primary and halo images include motion picture footage of at least the ball.
  • the primary and halo images may further include motion picture footage of an individual or individuals engaged in the sporting contest, goals, wickets or relevant line markings.
  • the quality and framing of the dynamic primary image is defined by the field of view (zoom) and subject distance (focus) of a lens of said first camera and the camera's alignment on the servo pan tilt head.
  • the quality and framing of the dynamic halo image is defined by the field of view and subject distance of a lens of the second camera and the camera's alignment on the servo pan tilt head.
  • the shape of the primary image and halo image may be, but is not limited to, circles, ovals, squares and rectangles.
  • the primary image and halo image, defined by their respective fields of view and subject distances, can be altered. This is important because the composition of camera footage that is most desirable for a viewer will vary depending upon the behaviour of the player or players engaged in play. In this way close-up footage of the object, such as a particular sports player, can be captured with one camera whilst secondary cameras automatically capture the wider area around the player, which may include opposing players that may contest for the ball, or team mates to whom the ball may be passed.
  • the object being tracked is a ball being used to play a sport such as soccer or basketball, and the motion picture primary and halo images move so as to include the ball and the individual or individuals engaged in play, or other images of audience interest.
  • the term play refers to the progress of the game in which the individual player or players are actively engaged.
  • the halo image may be positioned forward of one side of the primary image, wherein the halo image extends forward of the player and includes defending players that are in close proximity to the first player and that may engage them in play within a short period of time.
  • the method may use at least one primary image contained within at least one halo image.
  • an operator may use the halo image or multiple halo images.
  • the primary and halo images may be locked onto a predefined object, including an RF tag or movable point herein referred to as a track node, which may follow the game ball, player or vehicle.
  • a track node refers to a series of points having x, y, z coordinates within a mathematical model that is created by surveying and mapping the surface of a selected area.
  • the track node may replicate, within the
  • the size of the primary and halo images can be individually adjusted.
  • the images' size can also be set as either a percentage of the primary image, or as an adjustable fixed size, or as a variable logarithmic percentage of the primary image.
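The sizing modes above can be sketched as a small selector function. This is a minimal illustration only; the parameter names and the exact form of the logarithmic option are assumptions, since the text does not specify a formula:

```python
import math

def halo_size(primary_size, mode="percentage", factor=1.5, fixed=20.0):
    """Return a halo diameter for a given primary-image diameter.

    Implements the three sizing options described in the text; the
    logarithmic curve below is an illustrative assumption.
    """
    if mode == "fixed":
        return fixed                      # operator-adjustable fixed size
    if mode == "percentage":
        return primary_size * factor      # straight percentage of primary
    if mode == "log":
        # grows more slowly as the primary image gets larger (assumed form)
        return primary_size * (1.0 + math.log1p(primary_size) / 10.0)
    raise ValueError(mode)
```

A fixed halo keeps the wide shot stable, whereas the percentage modes let the halo breathe with the primary image as the operator zooms.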
  • the size of a halo may also be determined via the position of the track node within a bias zone.
  • the bias zone may have predefined parameters that control the position of the primary and halo images around the tagged object or track node.
  • the predefined parameters are preferably stored in software.
  • Primary and halo images are preferably controlled by software to facilitate the often complex requirements of correct framing of any given sport or activity.
  • the following basic summary alerts the reader to some of the complexities of these interactions.
  • the images encircle the tracked object and have offset limit lines that keep the tracked object within specified boundaries. These boundaries can be thought of as a fence that stops the tracked object from exiting.
  • the images also have location fields within the limit lines. The location field positions the image around the tracked object depending on the tracked object's position within the bias zone, which typically covers the entire playing arena, and on the direction of travel, which is an operator-adjusted function.
  • the space where images can be moved is also restricted by the bump bars, which are typically located just outside the boundary of the playing field or performance space.
  • the images may have limit lines, which are lines parallel to the image's external edge that can be offset at specified distances or at a percentage of the image's diameter or longest side. Images are designed to capture the tracked object or track node within the image's limit lines.
  • the limit lines effectively give the object or player being framed some space around them before the edge of the television picture frame.
  • the limit lines also have a variable cushioning effect that enable the track node to have a range of hard to soft collisions with the limit line. This cushioning effect enables a smoother visual motion picture without jerky changes in direction.
  • the limit lines can be set outside the image, thereby enabling the track node to be captured by the limit lines while still being outside the image.
  • the limit lines can be offset from the outside edge of the image, and the methods of offset include, a specified distance, specified percentage of the diameter or diagonal, and a combination of both percentage and specified minimum and maximum distances.
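The three offset methods just listed can be combined in one helper. A sketch, with parameter names assumed for illustration:

```python
def limit_line_offset(diameter, distance=None, percent=None,
                      min_d=None, max_d=None):
    """Offset of a limit line from the image's outside edge.

    Covers the described methods: a specified distance, a specified
    percentage of the diameter (or diagonal), or a percentage clamped
    between specified minimum and maximum distances.
    """
    if percent is None:
        return distance                      # fixed-distance method
    offset = diameter * percent / 100.0      # percentage method
    if min_d is not None:
        offset = max(offset, min_d)          # combined method: clamp low
    if max_d is not None:
        offset = min(offset, max_d)          # combined method: clamp high
    return offset
```

The clamped form keeps the breathing room sensible at extreme zooms: the percentage rule dominates at mid sizes while the minimum and maximum bound it at either end.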
  • the relationship between the primary and halo images is relative to, and controlled by, a control means.
  • the size of the primary image may be proportional to the halo image. This proportional relationship may be direct or inverse, linear or exponential.
  • each image has a location field that consists of x, y and z axes that typically bisect the centre of the image.
  • Location fields have variable patterns, which include but are not limited to, orthogonal patterns with one or two axes, curved grid patterns, parabolic patterns, or concentric circle patterns.
  • the track node, which is the object being tracked, interacts with the location fields, the direction of travel, and the bias zones to enable the correct motion picture framing of the tracked object within the television's picture frame.
  • the location field adjusts the position of the track node along its x axis in proportion to the direction of travel of the track node.
  • the location field adjusts the position of the track node along its y axis in proportion to the track node's position within the bias zones. Further information on the methods of interaction between track nodes, location fields, direction of travel and bias zones is contained in subsequent sections.
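Read together, the two proportional rules amount to a two-axis offset of the track node within the image. A minimal sketch, assuming both inputs are normalised to the range -1..1 and that the proportionality is linear (the text does not fix either detail):

```python
def node_offset(direction_of_travel, bias_position, half_width, half_height):
    """Offset of the track node from the image's centre.

    x: proportional to the direction of travel, so leading space opens
    up ahead of the node; y: proportional to the node's position within
    the bias zone. Both inputs are normalised to -1..1.
    """
    dx = -direction_of_travel * half_width   # push node back, space ahead
    dy = bias_position * half_height
    return dx, dy
```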
  • the images' movement, size, position and relationship with each other may vary depending on the tracked object's velocity, direction of travel, behaviour, position within the bias zone and relative direction with respect to the physical location of the first or second camera.
  • the relationship may also be altered depending upon the character of the object being tracked. For instance, where a player is being tracked, their movement and behaviour will be restricted to a narrow flat band adjacent to the playing surface. In contrast, the movement and behaviour of a football being kicked would be quite different and would be within a broader band that extends upwardly from the playing surface. Accordingly, the relationship may be altered by the trajectory or expected trajectory of the ball. In such a situation the dynamic primary image may follow the trajectory of the ball whilst the dynamic second halo image may capture footage of the expected landing area that has been calculated from the trajectory of the ball. Typically the primary image is positioned within the halo image, although it should be appreciated that the halo image may be separated from the primary image.
  • the halo image may be uncoupled from the primary image such that the second camera is directed at the goal when the track node or ball comes into contact with the specified area.
  • the uncoupling of the halo image from the primary image may be done automatically by way of computer software when the target object is located within a predetermined space such as the goal square. Alternatively this uncoupling can be performed via the user interface and in one form a switch may be used. The uncoupling of the images or halos may also occur when footage of the crowd, coach's box, or other predetermined areas is required. This uncoupling and repositioning of the second camera may be performed by separate control switches.
  • multiple halo images can surround the primary image and each halo image can have its own specified size.
  • the capturing of the images is controlled by software that may include, bias zones, bump bars, direction of travel, framing limit lines, split button, and proportional head room framing. Individual halo images may be able to interact with the software while the primary image may not interact. The operator can individually activate or deactivate each image's interaction with the software.
  • a motion picture capturing apparatus including, a first camera for capturing a dynamic primary image of a moving object, a second camera for capturing a dynamic halo image that extends around the primary image, and a control means for controlling the position of said dynamic halo image around the said dynamic primary image.
  • the first camera and all secondary cameras are controlled by servo-assisted pan tilt heads and servo-assisted lenses that control the focus and zoom.
  • the control means further controls the pan, tilt, zoom and focus of the respective first and all secondary cameras.
  • the relationship between the primary image and all halo images may be altered by use of the control means that may include a user interface and designated software.
  • This user interface may include a touch screen, which shows live video and a synchronised 3D model of the playing area.
  • the control means may require the synchronisation of the virtual 3D computer-generated environment with a camera's real-world view of the same environment. This synchronisation enables the operator to see the overlaid 3D model, such as a soccer field's line markings, over the video. This enables the operator to work in the 3D model computer world while still seeing what is happening via the video.
  • This synchronisation typically requires: the calibration and charting of the servo-encoded lens's zoom and focus; a 3D model of the environment, created either by surveying the environment or by having a known standard environment such as a tennis court; the cameras having known 3D locations with associated x, y, z coordinates, with the pitch and yaw of the horizontal plane of the camera head also being known; and each camera being mounted onto a servo-encoded pan tilt head.
  • This synchronisation enables a computer to determine the camera's field of view via the encoder's reading of the pan, tilt, zoom and focus settings. As a result the operator sees an accurate virtual 3D model superimposed over the real world video. Thus when a camera's field of view moves, then the synchronised 3D model also precisely moves in real time.
  • This synchronisation now enables one human operator to accurately command and control in real time multiple cameras around a designated area and see the camera vision and the superimposed 3D geometric and spatial software functions working. This can enable far superior accuracy of framing and focusing on dynamic targets.
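The geometry behind this command-and-control loop can be sketched very simply: given a camera's surveyed position and a surveyed target point, the pan and tilt a servo head must adopt follow from basic trigonometry. This is a simplified illustration assuming a level camera head (zero pitch and yaw offsets), not the patent's full calibration:

```python
import math

def aim_angles(cam, target):
    """Pan and tilt (degrees) that centre a 3D target in a camera's view.

    cam and target are (x, y, z) world coordinates from the surveyed
    model; pan is measured about the vertical axis, tilt up from the
    horizontal plane.
    """
    dx, dy, dz = (t - c for t, c in zip(target, cam))
    pan = math.degrees(math.atan2(dy, dx))
    tilt = math.degrees(math.atan2(dz, math.hypot(dx, dy)))
    return pan, tilt
```

Running the same computation in reverse, from the encoder readings back to a field of view, is what lets the superimposed 3D model track the live video.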
  • control means further includes a broadcast switching device to enable the operator to select the footage that is to be broadcast or recorded.
  • the components of the apparatus, such as the cameras, display means and control means, may be connected by way of a communication means such as, but not limited to, a modem communication path, a computer network such as a local area network (LAN), a wide area network such as the Internet, or a radio frequency (RF) link.
  • the processor and the memory cooperate with each other and with other components of a computer to perform all of the functionality described herein.
  • the processor executes appropriate software to perform all of the functionality described herein.
  • the control means is a computer including RAM and ROM memory, a central processing unit or units, input/output (I/O) interfaces and at least one data storage device.
  • the computer includes application software for controlling the cameras and performing functions, stored in a computer readable medium on a storage device.
  • the apparatus may include embedded software or firmware with corresponding hardware that is designed to perform one or more dedicated functions.
  • control means includes a computer monitor with a virtual model or map of the playing surface which overlays in real time over the camera's video.
  • the virtual model may include such things as the boundaries of the playing surface, goals and relevant line markings. It is within the computer model that the operator can command and control and see the various geometric and spatial software functions working over the camera's video.
  • a motion picture capturing apparatus including, a first camera for capturing a dynamic primary image of a moving object, at least one second camera for capturing a dynamic halo image that substantially extends around the primary image, and a control means for controlling the position of said dynamic images.
  • a track node may be stored within software to facilitate the positioning of the said primary and halo images.
  • Track nodes are mathematical points that can be assigned to track vehicles, players or the match ball to give them a positional reference.
  • the real time position of the track node is governed by, but not limited to, GPS devices, RF tagging devices, optical recognition devices, and manual tracking using either a mouse or a stylus on a touch screen.
  • Images can be individually assigned to specified track nodes.
  • Track nodes can spatially interact with the images in a variety of ways.
  • a track node may be locked onto the cutting plane, thereby setting the height of the track node away from the playing surface, while allowing the track node to travel across the cutting plane in any direction, speed and acceleration.
  • the track node can also be offset from the cutting plane in a variety of ways that include, but are not limited to, a wheel on a mouse, a wheel within a control interface, and depressing a button and using a touch screen stylus, moving the stylus either up or down the touch screen.
  • the computer uses the position of the track node to calculate the subject distance for the lenses' focus settings, thereby enabling the area around the track node to always be in focus.
  • the subject distance is the distance from the lens to the subject or tracked target.
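With surveyed lens positions and a live track-node position, the subject distance is simply the 3D distance between the two points; a minimal sketch:

```python
import math

def subject_distance(lens_pos, node_pos):
    """Distance from the lens to the track node, used to drive the
    servo lens's focus setting so the area around the node stays sharp."""
    return math.dist(lens_pos, node_pos)
```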
  • Multiple track nodes can be utilised where there are multiple targets requiring tracking.
  • Nominated cameras can be exclusively assigned to specified track nodes while interacting with the software devices.
  • the cutting plane enables the images to have the z-axis position as the cutting plane's surface.
  • the cutting plane is a mathematical plane contained within software that is offset from the playing surface at variable heights.
  • the plane can be parallel to a designated surface, or it can be a curved or variable surface over the playing field or surface.
  • the cutting plane can also be shaped into any profile such as a plane that is offset 1 meter and parallel to a complex and undulating motor racing track.
  • cutting planes will extend well beyond the primary playing area into secondary areas, such as the surrounding playing areas, grandstands and vehicular run-off areas.
  • the primary function of the cutting plane is to allow the track nodes, and thereby the captured images to travel across the cutting plane's surface or be offset from it.
  • the cutting plane enables better accuracy when tracking motor vehicles because the vehicle's height from the racing track is always known (unless the vehicle is flying); therefore GPS tracking inaccuracies in the Z direction, or height, can be removed.
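Removing the GPS height error amounts to replacing the fix's z value with the known cutting-plane height at that point. A sketch; the callable form for an undulating circuit is an illustrative assumption:

```python
def snap_to_cutting_plane(gps_fix, plane_height):
    """Replace a GPS fix's unreliable z value with the known
    cutting-plane height (e.g. 1 m above the track surface).

    plane_height may be a constant, or a callable of (x, y) modelling
    an undulating circuit surveyed into the 3D model.
    """
    x, y, _ = gps_fix                 # discard the noisy GPS height
    z = plane_height(x, y) if callable(plane_height) else plane_height
    return (x, y, z)
```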
  • a bias zone contained within the software interacts with the track node's position within the bias zone to dictate how the images are positioned around the track node.
  • Bias zones have variable patterns that include but are not limited to: orthogonal patterns with one or two axes, or concentric circle or oval patterns.
  • the track node may travel to either side of the bias zone's x axis; the further the track node is from the bias zone's x axis, the further away the track node is from the image's x axis, while still staying within the image's limit line.
  • Multiple bias zones may also be utilised, for example an orthogonal bias zone covering an entire soccer field and two concentric circle bias zones each with a 30m radius centred on each goal.
  • the resultant effect on the halo images around the track node is based on the averaging of the two bias zones' effects, which of course is dependent on the track node's position within the bias zones.
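The averaging of the zones' effects can be sketched as a mean of the per-zone framing offsets; the representation of an "effect" as an (x, y) offset is an illustrative assumption:

```python
def combined_bias_offset(offsets):
    """Mean of the framing offsets each bias zone would apply to the
    track node, e.g. a field-wide orthogonal zone plus a circular zone
    centred on a goal."""
    xs, ys = zip(*offsets)
    return (sum(xs) / len(offsets), sum(ys) / len(offsets))
```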
  • a direction of travel function may be stored within the software and in one form may be manually controlled via an adjustable slide device which has a neutral middle position and variable forward and back calibrations.
  • the direction of travel creates leading space forward or behind the track node within the images.
  • 90% forward on the slide results in the track node being located 90% back from the image's centre, thereby generating a very large leading space within the halo image in front of the track node.
  • the magnitude of the leading space, or distance between the track node's position and the offset from the image location field's y axis, is proportional to the magnitude of the direction of travel. The side of the image on which the leading space occurs is governed by the operator and is typically dependent on which way the ball is going.
  • a bump bar function may be stored within the software.
  • Bump bars are a software spatial ordering function that enable the images to bump into them, but generally do not let the images pass over their geometric alignment. Bump bars are like a fence that can be aligned where required, to frame the perimeter of the playing field. Bump bars have a variable deceleration setting that enables the halo images to cushion into the bump bars before contact occurs.
  • the images have three optional functionalities that enable them to, firstly, recognise bump bars and cushion into them; secondly, ignore the bump bars and their associated functions; and thirdly, a hybrid option where the halo images use the bump bars until the primary image crosses the bump bar, at which point the halo image will continue to surround the primary image as both images cross over the bump bars.
  • the bump bars stop the specified images from departing the area of the playing field, thereby keeping the cameras' field of view on the playing surface and on the players.
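The cushioning behaviour can be sketched as a clamp that eases an approaching image edge to a stop before it reaches the bar. The quadratic ease-out curve below is an assumption; the text only calls for a variable deceleration setting:

```python
def cushioned_clamp(desired_edge, bar, cushion):
    """Actual position of an image edge approaching a bump bar.

    Outside the cushion band the edge moves freely; inside it the edge
    decelerates smoothly and comes to rest before touching the bar.
    """
    start = bar - cushion                     # where cushioning begins
    if desired_edge <= start:
        return desired_edge                   # well clear of the bar
    t = min((desired_edge - start) / cushion, 1.0)
    return start + cushion * (t - 0.5 * t * t)   # ease-out, never passes bar
```

The curve has unit slope where cushioning begins (no visible jolt) and zero slope at full depth, so the televised motion decelerates smoothly rather than stopping dead at the fence.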
  • a picture frame function may be stored within the software.
  • the picture frame is a software ordering function that graphically shows the camera's "16 x 9 picture plane" around the captured image.
  • the sides of the picture frame always touch the images' external edges relative to the viewing alignment of the camera. As such if the image expands then the picture frame expands.
  • the sill and head heights of the picture frame, and the centre of the picture frame, can be set in a variety of ways. Firstly, the bottom alignment of the picture frame, or sill, can have a vertical offset distance from either the cutting plane or the track surface at the track node's location; secondly, the picture frame can be set so that a specified horizontal axis or band of the picture frame always retains the track node on it while the picture frame holds the entire captured image; and thirdly, the side of the captured image closest to the camera will rest on the picture frame's sill.
  • An additional overriding function on the picture frame's head height is the proportional head room function, which interacts with the size of the images and the height of the cutting plane. When the picture frame's top alignment has reached a certain specified height above the playing surface, the picture frame's height will not drop any further; and if the picture frame needs to reduce in size because of a contracting image size, then the picture frame's bottom alignment, or sill, will rise, allowing the picture frame to shrink in size.
  • This proportional framing function can also be used in an inverse fashion, so that the operator can zoom in on the player's feet in a similar manner.
  • Picture frames and the visual limit plane have a geometric relationship that stops the picture frame from passing across a visual limit plane.
  • a visual limit plane function may be stored within the software.
  • a visual limit plane is of any size and shape that can be positioned at any horizontal, vertical or angular alignment.
  • the visual limit plane is a spatial software function that enables the camera's view to be restricted from looking past a specified alignment or plane.
  • the visual limit plane affects the camera's zoom, pan and tilt. In a typical sporting application like soccer, the visual limit plane will be located just under the roof line of the stadium. When the wide field of view camera and its associated wide image are tracking a player on the far side of the field, the head of the picture frame will contact the visual limit plane, stopping the camera's field of view from seeing under the stadium roof and pushing it further onto the playing field where the action is.
  • Visual limit planes can be set individually for each camera and are particularly useful when located just under the roof of stadiums, at stage boundaries, or at the edges of unsightly structures.
  • the operator can set the visual limit planes and bump bars in appropriate positions within the 3D model which is superimposed over the real time video and examine all camera views for functionality and aesthetic composition.
  • a split button function may be stored within the software and enables the operator to push a button, thereby releasing the specified images from the cutting plane to follow a target such as a basketball through a path of travel.
  • the system recognises the track node's location and draws a base line from that point to the designated target point which can be the centre of the basketball or netball hoop.
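Drawing the base line is straightforward once the track node and target point are known in the 3D model. A minimal sketch that samples points along the line; the sampling approach is illustrative, not specified by the text:

```python
def base_line(track_node, target, steps=10):
    """Sample points along the base line drawn from the track node's
    location to a designated target point (e.g. the centre of the
    basketball or netball hoop)."""
    (x0, y0, z0), (x1, y1, z1) = track_node, target
    return [(x0 + (x1 - x0) * t, y0 + (y1 - y0) * t, z0 + (z1 - z0) * t)
            for t in (i / steps for i in range(steps + 1))]
```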
  • an image tally light function may be stored within the software.
  • the image tally light may highlight the live feed camera's halo or picture frame.
  • a vista line function may be stored within the software and creates a series of lines within the virtual 3D computer model that start at a camera location and extend to the tangent points on both sides of that camera's images. The lines may be terminated at either the image's tangents, or the cutting plane, or a designated distance past the image. Similarly the centre vista line starts at the camera location and extends to the track node, and may terminate at the track node, or the cutting plane, or a designated distance.
  • a hierarchy of commands function may be stored within the software. Many of the aforementioned functions interrelate with each other and in some circumstances may need to override each other. As such, a hierarchy of commands is structured within the system requirements, enabling commands to overrule other commands.
  • a relative zoom points function may be stored within the software.
  • This software function enables a point on the cutting plane to be selected, e.g. the soccer goals, and for that point to stay in the same location within the camera's field of view as the operator zooms in or out, either by manual controls or in a preset manner.
  • This software command can also utilise the camera's picture plane via the systems understanding of the lens's field of view.
  • a pan point function may be stored within the software and enables the operator to select two points, a genesis point and a terminus point, whereby the designated camera will pan between these points along a designated path.
  • This designated path or spline can be adjusted by the operator to form any alignment within a 3D space.
  • the zoom setting or key framing at the genesis and terminus points and at any number of points along the spline can be designated so that the lens' zoom extrapolates evenly between them as the camera's centre of view pans along the spline. Time, zoom settings, and speed between the pan points can be specified.
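The even extrapolation of zoom between key-framed points can be sketched as piecewise-linear interpolation over the spline parameter. A minimal illustration; the (parameter, zoom) keyframe representation is an assumption:

```python
def zoom_at(s, keyframes):
    """Zoom setting at spline parameter s (0..1), interpolated evenly
    between key-framed (s, zoom) pairs set at the genesis point, the
    terminus point, and any points along the spline."""
    keyframes = sorted(keyframes)
    for (s0, z0), (s1, z1) in zip(keyframes, keyframes[1:]):
        if s0 <= s <= s1:
            t = (s - s0) / (s1 - s0)
            return z0 + (z1 - z0) * t        # even blend within segment
    # outside the key-framed range: hold the nearest end value
    return keyframes[0][1] if s < keyframes[0][0] else keyframes[-1][1]
```

Time and speed between pan points could be handled the same way, with each quantity carried as an extra channel per keyframe.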
  • Figure 1 is a schematic view of a primary image and the surrounding halo
  • Figure 2 is a schematic view of a first embodiment of the apparatus for camera control of the present invention
  • Figure 3a is a schematic view of the various configurations of the primary image area and surrounding halo image area of figure 1 illustrating the bump bars around the periphery of the playing arena;
  • Figure 3b is a schematic view of a primary and halo images and their interaction pattern as they move within the bias zone, showing that the interaction pattern is firstly based upon the position of the track node within the bias zone and secondly the position of the bump bars;
  • Figure 3c is a schematic view of a fixed size primary and halo images and their interaction pattern as they move within the circular bias zone;
  • Figure 3d is a schematic view of a fixed size primary image and variable size halo image and their interaction pattern as they move within the circular bias zone;
  • Figure 3e is a schematic view of a halo and its component parts
  • Figure 3f is a schematic view of some of the embodiments of a halo
  • Figure 3g is a schematic view of a bias zone and its component parts
  • Figure 3h is a schematic view of some of the embodiments of a bias zone
  • Figure 4a is a schematic view of the primary image of figure 1 illustrating a first embodiment of the vertical barrier above the playing surface;
  • Figure 4b is a schematic view illustrating a second embodiment of the vertical boundary above the playing surface
  • Figure 5 is a schematic view illustrating a further embodiment
  • Figure 6 is an overhead view of the movement of a player across a playing surface illustrating the position of the images captured by the first and second cameras.
  • the motion picture capturing apparatus includes a first camera 12 for capturing a dynamic primary image 14 of an object 16, the primary image 14 being defined by the field of view 18 and subject distance 20 of the lens 22 of the first camera 12.
  • the apparatus 10 further including a second camera 24 for capturing a dynamic halo image 26 that contains and extends around the primary image 14, the halo image being defined by the field of view 28 and subject distance 30 of the lens 32 of the second camera 24.
  • the dimensions of at least the halo image 26 and the position of the primary image 14 therewithin may be altered depending upon the direction of travel and behaviour of the object 16.
  • the apparatus 10 can be used to capture footage of a sporting contest, such as a game of soccer.
  • the first and second cameras 12, 24 are placed around a playing surface in this example being a soccer field 34 having a boundary line 36, various field markings 38 and opposing goals 40, 42.
  • a third camera 44 is configured to capture an image 46 of the playing field 34.
  • Signals are received from and sent to cameras 12, 24 and 44 by way of communication means 48.
  • the communication means 48 may be hard wired to the cameras or be connected by way of a transmitter/receiver.
  • the communication means 48 is connected to a control means 50, including a touch screen 52, for displaying image 46, and stylus 54, for controlling the images captured by the first and second cameras 12, 24, and a broadcast switcher 56 in communication with a broadcast tower 58 for controlling the television images broadcast.
  • the broadcast switcher 56 includes switches 60, 62 for selecting the desired images for broadcasting.
  • the object 16 is a soccer player 64 who is kicking a ball 66 down the field 34 in the direction of arrow 68 which indicates the direction of travel.
  • the direction of travel is communicated to the control means 50 via the joystick 74.
  • the image 46 of the field is displayed on the touch screen 52.
  • the operator uses the stylus 54 to position the track node 11 in the centre of the play between the soccer player 64 and the soccer ball 66.
  • the size of the images can be controlled via the rotation of the joystick's knob 75.
  • the movement of the stylus 54 across the display means 52 generates digital signals representative of the required panning, tilting, focusing and zoom operations of the cameras 12, 24 and their lenses 22, 32 to track an object 16 across surface 34.
  • the operator can either select to follow an individual player that is in control of the ball or the ball itself depending upon the required shots and whether the ball is being passed between players.
  • the movement of the stylus 54 across the screen 52 results in corresponding movement of cameras 12, 24. It should however be appreciated that the user's finger or tracking subsystems could be used instead of the stylus 54 to track movement of the object 16 across the touch screen 52.
  • the stylus 54 is used to control the first camera such that the track node 11 of the primary image corresponds to the position of the stylus 54 on the image 46 displayed on the screen 52. In the present embodiment, the position of the stylus 54 controls the position of the halo 26 around the primary image 14.
  • the images 14, 26 captured by the first and second cameras 12, 24 are displayed on screens 70, 72.
  • the screens 70, 72 are used so that the operator can select the best image for broadcasting.
  • the display means 52 may include the images captured by the cameras or the apparatus may include a separate split screen displaying the images captured by the various connected cameras.
  • the apparatus 10 utilises a joystick 74 for controlling the direction of travel although in another form this joystick 74 can be used for controlling the position of the images around the track node 11.
  • the joystick knob 75 may also be used to control the dimensions of the primary and/or halo images.
  • the computer includes application software for controlling the computer, receiving data from the screen 52, stylus 54 and joystick 74.
  • the software is configured to generate appropriate signals to control the servo-assisted camera heads and encoded lenses that control pan, tilt, focus and zoom of the cameras 12, 24 depending upon the signals received from the screen 52, stylus 54 and joystick 74.
  • Application software may be stored in a computer.
  • the lenses 22, 32 are calibrated either by using the manufacturer's data or by setting up the camera and lens in a known environment and recording the focus and zoom settings at variable distances and variable fields of view. Encoders recognise these focus and zoom settings and this data is stored; alternatively the analogue settings of the lens may be used, but these will not be as accurate. System algorithms utilise this data to enable automated lens control. Thus focus for each lens is achieved by knowing the distance between the camera location and the track node 11. The lens's zoom is achieved by knowing the size of the image 14 and the distance between camera 12 and the image 14, then applying the calibrated lens's algorithms to facilitate the correct field of view (zoom).
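Purely as an illustrative sketch (the calibration table layout, units and function names below are assumptions, not part of the disclosure), the lens control described above could be approximated with piecewise-linear lookups over the recorded calibration data: focus driven by camera-to-node distance, and zoom driven by the angular field of view required to frame an image of a given size.

```python
import math

def interpolate(table, x):
    """Piecewise-linear lookup in a list of (input, encoder) calibration pairs."""
    table = sorted(table)
    if x <= table[0][0]:
        return table[0][1]
    for (x0, y0), (x1, y1) in zip(table, table[1:]):
        if x <= x1:
            return y0 + (x - x0) / (x1 - x0) * (y1 - y0)
    return table[-1][1]

def focus_encoder(focus_table, camera_to_node_distance):
    """Focus is driven by the distance between the camera and the track node."""
    return interpolate(focus_table, camera_to_node_distance)

def zoom_encoder(zoom_table, image_width, camera_to_image_distance):
    """Zoom is driven by the angular field of view needed to frame the image."""
    fov_deg = math.degrees(2 * math.atan(image_width / (2 * camera_to_image_distance)))
    return interpolate(zoom_table, fov_deg)
```

A real system would use the denser charted calibration data gathered at variable distances and fields of view rather than the two-point tables shown here.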
  • the camera's servo driven pan tilt heads are also encoded thereby enabling the system to recognise, command and control the direction of the camera's alignment.
  • the camera control system can be used to record images of various sporting activities.
  • the apparatus 10 can be used to capture footage of a basketball game played on a basketball court 76 having court markings 78, a boundary line 80 and opposing hoops 82 and 84.
  • the control means 50 includes a virtual map of the playing surface. This virtual map includes the respective court markings, boundary line and positions of the basketball hoops.
  • the virtual map also includes a virtual barrier or bump bar 86 that constrains the movement of the first and second cameras to thereby control the images 14, 26 that are captured. The reader should appreciate that this prevents unwanted footage being captured, such as running tracks around the outside of the playing field or images of the edge of the crowd or empty seats.
  • Figure 3b illustrates the variable relationship between primary image 14 and halo image 26, dependent on the position of the track node 11 within the bias zone 6 and the direction of travel 68, which is set at 50% left. The illustration shows that when the direction of travel 68 is 50% left, the track node 11 is at the +50th percentile of the halo image 26 location field's y axis throughout the bias zone 6, until the halo image 26 collides with the bump bar 100, at which time the halo image 26 stops and the primary image 14 is allowed to slide to the left within the halo image 26.
  • Figure 3b also shows that when the track node is on the bias zone's 80% x axis 31 alignment, the secondary image location field has the track node on its 80% x axis 17a alignment. Similarly, when the track node is on the bias zone's -40% x axis alignment, the secondary location field has the track node on its -40% x axis alignment. And once again, when the track node is on the bias zone's -80% x axis alignment, the secondary location field has the track node on its -80% x axis alignment.
  • the centre of the image's X & Y axes is 0% and the image's limit lines 19 are +/- 100%.
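As an illustrative sketch only (the function name and the direct percentage pass-through are assumptions), the figure 3b relationship could be expressed as a mapping from the track node's percentage position in the bias zone, and the operator-set direction of travel, to the track node's percentage position within the image's location field, clamped at the +/- 100% limit lines. The linear variant is shown; the patent also contemplates logarithmic relationships.

```python
def image_offset(track_node_pct, direction_pct):
    """Map bias-zone position (x axis) and direction of travel (y axis) to
    the track node's position within the image's location field.

    All values are percentages: 0 at the centre, +/-100 at the limit lines.
    """
    x_pct = max(-100.0, min(100.0, track_node_pct))   # along-field axis
    y_pct = max(-100.0, min(100.0, direction_pct))    # direction-of-travel axis
    return x_pct, y_pct
```

So a track node on the bias zone's 80% x axis alignment places the node on the location field's 80% alignment, while values beyond the limit lines are held at 100%.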
  • the properties of the bias zone can also be changed, and this includes both linear and logarithmic relationships between bias zones and the track node's position within the location field. Multiple overlapping bias zones can be used together, which enables an averaging of the bias zones' effects on the image's position around the track node. This enables the halo cameras to have a particular bias towards a geographical location such as a soccer goal.
  • Concentric circle bias zones, as in figures 3c and 3d, work in a different manner to those discussed previously. Concentric circle bias zones control the halo image's position around the track node. This is enabled by creating an alignment line 19a between the track node 11 and the centre of the bias zone 6, which is extended at the track node end so as to bisect the primary image; alternatively the alignment line is extended by an additional percentage or offset distance.
  • the operator's preset options include: fixing the size of the secondary image as per figure 3c; enabling the size of the secondary image to expand and contract while always keeping the centre of the bias zone and the primary image within its limit line as per figure 3d; enabling the primary image to be positioned within the secondary image in accordance with typical bias zone methods as per figure 3c; and having the primary image always tangential to the secondary image's limit line as per figure 3d.
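By way of illustration only (names and the fixed-offset variant are assumptions), the alignment-line construction of figures 3c and 3d might be sketched as follows: the halo centre is placed on the line from the bias-zone centre through the track node, extended a chosen offset distance beyond the node.

```python
import math

def halo_centre_on_alignment_line(bias_centre, track_node, offset):
    """Place the halo image's centre on the alignment line from the
    bias-zone centre through the track node, extended 'offset' beyond
    the node so that the line bisects the primary image."""
    dx = track_node[0] - bias_centre[0]
    dy = track_node[1] - bias_centre[1]
    length = math.hypot(dx, dy)
    if length == 0:
        return track_node  # node sits on the bias-zone centre: no extension
    ux, uy = dx / length, dy / length  # unit vector along the alignment line
    return (track_node[0] + ux * offset, track_node[1] + uy * offset)
```

A percentage-based extension, as the patent also allows, would scale `offset` from the node-to-centre distance instead of using a fixed value.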
  • the methods illustrated in figures 3c and 3d are useful in numerous sporting applications where goals are being used and the television viewer's focus of attention is generally where the game ball and the goals are. This would be the case in soccer, netball, ice hockey and basketball. Similarly in cricket, the entire cricket pitch can form part of the bias zone centre, which is always within a camera's halo, as is the ball as it is hit around the cricket ground.
  • Figure 3b shows that the track node is central within the primary image regardless of the track node's direction of travel or its position within the bias zone, although the primary image has the same functionality as the halo image to have the track node offset within itself, dependent on the direction of travel and the track node's location within the bias zone.
  • Primary and halo images can have a preset maximum and minimum size.
  • the centre of the image's axes is 0% and the limit lines are +/- 100% on all axes. Both a linear and a logarithmic relationship can be used between the direction of travel and the track node's position within the location field.
  • the bias zones, images and image location fields may all be 3D spatial structures, working with similar methodologies to those previously described, although with 3D properties.
  • Adopted 3D structures may include spheres, cylinders, cones, or rectangular prisms.
  • a GPS tag would typically be used to establish real time 3D location of the track node.
  • the virtual map of the court 76 stored on the control means 50 is in three dimensions.
  • the virtual map includes a cutting plane 92, which is used to control the plane on which the images 14, 26 move.
  • the height of the cutting plane 92 can be varied.
  • the position of the stylus 54 on the cutting plane typically generates the location of the track node.
  • Figure 4a illustrates an area 94 or image that a number of cameras may be focused on.
  • the ball is typically passed at chest height hence the cutting plane is located at chest height as per figure 4a.
  • Activity in soccer generally occurs at ground level, hence the cutting plane 92 would be lowered accordingly.
  • the virtual map includes barrier 96, which inhibits the vertical movement of the field of view 18 (figure 1) above a certain plane.
  • the barrier 96 can be either parallel to the playing surface 76, as illustrated in figure 4a, or may take any form or shape, including being sloped upwardly from a mid point of the court to the opposing goals 82, 84, as illustrated in figure 4b.
  • the barrier 96 above the playing surface acts like a virtual roof and prevents footage being captured of unwanted detail such as empty spectator stands.
  • When a target is in correct sharp focus, the distance between the focal point of the lens and the target is known as the subject distance 20.
  • the end point of the subject distance may be coupled to the object 16 or to the centre of the primary or halo image 14, 26.
  • the plane of the halo image 26 can be offset from the plane of the primary image 14. This action may occur from a bias zone interaction affecting only halo image 26.
  • the Image's 26 position enables both the basketball hoop 82 and player 64 to be in shot, and for the focus to be as sharp as possible.
  • the primary and halo images may be uncoupled, whereby one halo image tracks an object such as a ball while the other halo image is trained in a prescribed manner onto the landing zone of the ball, which is calculated via the ball's trajectory.
  • This function can be activated by the operator or be automatic.
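Purely as an illustrative sketch of the landing-zone calculation mentioned above (the patent does not specify the ballistic model; simple drag-free projectile motion and all names here are assumptions):

```python
def landing_point(pos, vel, g=9.81):
    """Predict where a ball launched from pos (x, y, z) with velocity
    (vx, vy, vz), in metres and m/s, returns to ground level (z = 0),
    ignoring air resistance."""
    x, y, z = pos
    vx, vy, vz = vel
    # time of flight: positive root of z + vz*t - 0.5*g*t^2 = 0
    disc = vz * vz + 2 * g * z
    t = (vz + disc ** 0.5) / g
    return (x + vx * t, y + vy * t)
```

The second camera's halo could then be trained on the returned (x, y) point while the first halo continues to track the ball itself.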
  • cameras 12, 12a, 12b are used to capture respective primary images 14 and cameras 24 and 24a are used to capture respective halo images 26.
  • each camera can have its own halo image and bias zone, and as such the number of halo sizes at any one time is only limited by the number of cameras. Accordingly, this gives the operator greater flexibility in selecting a suitable image for broadcasting.
  • the apparatus 10 can be used to provide footage of a soccer game being played on a soccer field 34.
  • the present example includes plays 94 and 96 that will be used to illustrate the relationship between the primary and halo images 14 and 26.
  • the first play 94 starts at the kickoff from the centre circle, when the ball is located on the centre spot.
  • the primary image 14 is positioned at a centre point of the halo image 26, as illustrated by event 98. This means that all players within the vicinity will be included in the halo image 26.
  • As play progresses and player 64 runs down the field, as illustrated by event 100, the primary image 14 is positioned towards the trailing edge of the halo image 26. This means that the halo image extends forward of the player 64, even when the player changes direction, as illustrated by event 102.
  • As illustrated by event 104, the halo image 26 is inhibited from extending beyond the bump bar 86.
  • halo image 26 could be large enough to capture the players in front of the goal 84.
  • the ball is then kicked to centre and directed into the goal 84 as illustrated by event 108.
  • the halo image 26 captured by camera 24 also changes orientation to include the goal and goalie.
  • the invention provides an apparatus and method of controlling a plurality of cameras to capture footage of a sporting or stage event.
  • the use of at least a first camera that captures a primary image that conforms to the target object and a halo image captured by a second camera having a wider field of view means that a single operator can simply and effectively control the composition of the television broadcast.
  • the use of a central control unit enables the operator to control a number of cameras by simply passing a stylus over the surface of a touch screen displaying live footage of the sporting arena.

Abstract

In one aspect the invention provides an apparatus and method of controlling a plurality of cameras to capture footage of a sporting event. The method of obtaining motion picture footage of a moving object includes the steps of, capturing a dynamic primary image of said object using a first motion picture camera, and capturing a dynamic halo image that extends around said primary image using a second motion picture camera, wherein the position of said dynamic primary image within said dynamic halo image can be altered. The use of at least a first camera that captures a primary image that conforms to the target object and a halo image captured by a second camera having a wider field of view means that a single operator can simply and effectively control the composition of the television broadcast.

Description

A method and apparatus for relative control of multiple cameras
FIELD OF THE INVENTION
The present invention relates generally to the field of camera control systems and in one aspect relates to the control of at least two cameras for capturing different images of an object moving across a surface, wherein a primary image is contained within, and movable within, a halo image, the position of the halo image being dependent upon the movement of the object.
BACKGROUND OF THE INVENTION
Televised sporting events are extremely popular on both free-to-air and pay television, with many channels being solely dedicated to sport. With the advent of more advanced camera technology, quality has increased and new camera shots have been achieved. Cameras located in cricket stumps and inside race cars are now common.
Many sporting activities, such as football and basketball, require complex shot sequences captured using a traditional tripod-mounted movable camera controlled by a skilled camera operator trained to capture the live action. The present invention provides an alternative whereby the cameras can be controlled automatically using servos and encoders, enabling auto focus, auto zoom, and auto pan and tilt. This system enables the camera to receive control signals from a control means to facilitate the capturing of imagery of the game. The cost of placing a skilled camera operator behind each camera is one of the limitations of manually controlled systems. Furthermore, due to health and safety issues regarding the operator, the placement of cameras around the perimeter of the playing field is restricted. A further limitation of a manually controlled system is that camera operators can obscure the action of the sport or stage production when close-ups are needed, as is the case with boxing and ice hockey.
There are numerous automated camera control systems currently available. Most of these systems fall within two categories, namely control systems that utilise tagged objects, and master/slave camera control systems. Systems using tags can however be simplistic, and do not provide for the framing and compositional variables that are required for modern day television broadcasting. On the other hand, one of the problems with master/slave systems is that the images captured by the slaved cameras are the same as those captured by the master camera, the only difference being that the angle from which the image is captured is different for each camera. It should be appreciated that any discussion of the prior art throughout the specification is included solely for the purpose of providing a context for the present invention and should in no way be considered as an admission that such prior art was widely known or formed part of the common general knowledge in the field as it existed before the priority date of the application.
SUMMARY OF THE INVENTION
In accordance with an aspect of the invention, but not necessarily the broadest or only aspect, there is proposed a method of obtaining motion picture footage of a moving object including the steps of:
capturing a dynamic primary image of said object using a first motion picture camera; and
capturing a dynamic halo image that substantially extends around said primary image using a second motion picture camera, said first and second motion picture cameras being controlled such that the primary image retains a portion of the halo image and the position of the halo image relative to the primary image can be altered.
The object may be a ball being used in a sporting contest, wherein the primary and halo images include motion picture footage of at least the ball. The primary and halo images may further include motion picture footage of an individual or individuals engaged in the sporting contest, goals, wickets or relevant line markings.
The quality and framing of the dynamic primary image is defined by the field of view (zoom) and subject distance (focus) of a lens of said first camera and the camera's alignment on the servo pan tilt head. The quality and framing of the dynamic halo image is defined by the field of view and subject distance of a lens of the second camera and the camera's alignment on the servo pan tilt head. The shapes of the primary and halo images can be, but are not limited to, circles, ovals, squares and rectangles.
In one form the primary image and halo image, defined by respective fields of view and subject distances, can be altered. This is important because the composition of camera footage that is most desirable for a viewer will vary depending upon the behaviour of the player or players engaged in play. In this way close-up footage of the object, such as a particular sports player, can be captured with one camera whilst secondary cameras automatically capture the wider area around the player, which may include opposing players that may contest for the ball, or team mates to which the ball may be passed.
In another form the object being tracked is a ball being used to play a sport such as soccer or basketball, and the motion picture primary and halo images move to thereby include the ball and the individual or individuals engaged in play, or other images of audience interest. The term play refers to the progress of the game in which the individual player or players are actively engaged.
As a player runs down the field the halo image may be positioned forward of, or to one side of, the primary image, wherein the halo image extends forward of the player and includes defending players that are in close proximity to the first player and that may engage them in play within a short period of time.
The method may use at least one primary image contained within at least one halo image. Typically an operator may use the halo image or multiple halo images. In another form the primary and halo images may be locked onto a predefined object, including an RF tag or movable point herein referred to as a track node, which may follow the game ball, player or vehicle. The reader should appreciate that throughout the specification the term track node refers to a series of points having x, y, z coordinates within a mathematical model that is created by surveying and mapping the surface of a selected area. The track node may replicate, within the mathematical model, the actual movement of a selected object across the mapped surface or alternatively it may replicate the movement of a pointer across a touch screen. The size of the primary and halo images can be individually adjusted. The images' size can also be set as either a percentage of the primary image, or as an adjustable fixed size, or as a variable logarithmic percentage of the primary image. The size of a halo may also be determined via the position of the track node within a bias zone. The bias zone may have predefined parameters that control the position of the primary and halo images around the tagged object or track node. The predefined parameters are preferably stored in software.
Primary and halo images are preferably controlled by software to facilitate the often complex requirements of correct framing of any given sport or activity. The following basic summary alerts the reader to some of the complexities of these interactions. The images encircle the tracked object and have offset limit lines that keep the tracked object within specified boundaries. These boundaries can be thought of as a fence that stops the tracked object from exiting. The images also have location fields within the limit lines. The location field positions the image around the tracked object depending on the tracked object's position within the bias zone, which typically covers the entire playing arena, and the direction of travel, which is an operator-adjusted function. The space where images can be moved is also restricted by the bump bars, which are typically located just outside the boundary of the playing field or performance space. The reader should now appreciate that to fully understand the functionality of capturing the images, the reader must also appreciate the interrelated software functions. Further detailed descriptions of these functions are contained in subsequent sections.
The images may have limit lines, which are lines parallel to the image's external edge that can be offset at specified distances or at a percentage of the image's diameter or longest side. Images are designed to capture the tracked object or track node within the image's limit lines. The limit lines effectively give the object or player being framed some space around them before the edge of the television picture frame. The limit lines also have a variable cushioning effect that enables the track node to have a range of hard to soft collisions with the limit line. This cushioning effect enables a smoother visual motion picture without jerky changes in direction. On specified occasions the limit lines can be outside the image, thereby enabling the track node to be captured while still outside the image. The limit lines can be offset from the outside edge of the image, and the methods of offset include a specified distance, a specified percentage of the diameter or diagonal, and a combination of both a percentage and specified minimum and maximum distances.
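As an illustrative sketch only (the function name and parameter layout are assumptions), the combined offset method described above, a percentage of the image's diameter clamped between specified minimum and maximum distances, could be expressed as:

```python
def limit_line_offset(image_diameter, pct, min_offset=None, max_offset=None):
    """Offset of a limit line from the image's external edge: a percentage
    of the image's diameter (or longest side), optionally clamped between
    specified minimum and maximum distances."""
    offset = image_diameter * pct / 100.0
    if min_offset is not None:
        offset = max(offset, min_offset)
    if max_offset is not None:
        offset = min(offset, max_offset)
    return offset
```

Omitting both clamps gives the pure percentage method; passing only one clamp gives the specified-minimum or specified-maximum variants.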
In still another form the relationship between the primary and halo images is relative to, and controlled by, a control means. In one form the size of the primary image may be proportional to the halo image. This proportional relationship may be directly or inversely proportional, and linear or exponential.
In yet another form each image has a location field that consists of x, y and z axes that typically bisect the centre of the image. Location fields have variable patterns, which include, but are not limited to, orthogonal patterns with one or two axes, curved grid patterns, parabolic patterns, or concentric circle patterns. The track node, which is the object being tracked, interacts with the location fields, the direction of travel, and the bias zones to enable the correct motion picture framing of the tracked object within the television's picture frame. In one form the location field adjusts the position of the track node along its x axis in proportion to the direction of travel of the track node. The location field adjusts the position of the track node along its y axis in proportion to the track node's position within the bias zones. Further information on the methods of interaction between track nodes, location fields, direction of travel and bias zones is contained in subsequent sections. The images' movement, size, position and relationship with each other may vary depending on the tracked object's velocity, direction of travel, behaviour, position within the bias zone and relative direction with respect to the physical location of the first or second camera.
The relationship may also be altered depending upon the character of the object being tracked. For instance, where a player is being tracked, their movement and behaviour will be restricted to a narrow flat band adjacent the playing surface. In contrast, the movement and behaviour of a football being kicked would be quite different and would be within a broader band that extends upwardly from the playing surface. Accordingly the relationship may be altered by the trajectory or expected trajectory of the ball. In such a situation the dynamic primary image may follow the trajectory of the ball whilst the dynamic second halo image may capture footage of the expected landing area that has been calculated from the trajectory of the ball. Typically the primary image is positioned within the halo image, although it should be appreciated that the halo image may be separated from the primary image. For instance, when a player is attempting a shot at goal the halo image may be uncoupled from the primary image such that the second camera is directed at the goal when the track node or ball comes into contact with the specified area. The uncoupling of the halo image from the primary image may be done automatically by way of computer software when the target object is located within a predetermined space such as the goal square. Alternatively this uncoupling can be performed via the user interface, and in one form a switch may be used. The uncoupling of the images or halos may also occur when footage of the crowd, coach's box, or other predetermined areas is required. This uncoupling and repositioning of the second camera may be performed by separate control switches.
In still another form multiple halo images can surround the primary image and each halo image can have its own specified size. The capturing of the images is controlled by software that may include, bias zones, bump bars, direction of travel, framing limit lines, split button, and proportional head room framing. Individual halo images may be able to interact with the software while the primary image may not interact. The operator can individually activate or deactivate each image's interaction with the software. In accordance with a second aspect of the invention there is proposed a motion picture capturing apparatus including, a first camera for capturing a dynamic primary image of a moving object, a second camera for capturing a dynamic halo image that extends around the primary image, and a control means for controlling the position of said dynamic halo image around the said dynamic primary image. In accordance with the above apparatus the first camera and all secondary cameras are controlled by servo-assisted pan tilt heads and servo assisted lenses that control the focus and zoom. In one form the control means further controls the pan, tilt, zoom and focus of the respective first and all secondary cameras.
The relationship between the primary image and all halo images may be altered by use of the control means, which may include a user interface and designated software. This user interface may include a touch screen, which shows live video and a synchronised 3D model of the playing area. The control means may require the synchronisation of the virtual 3D computer generated environment with a camera's real world view of the same environment. This synchronisation enables the operator to see the overlaid 3D model, such as soccer field line markings, over the video. This enables the operator to work in the 3D model computer world while still seeing what is happening via the video. This synchronisation typically requires: the calibration and charting of the servo encoded lens's zoom and focus; a 3D model of the environment, created either by surveying the environment or by having a known standard environment such as a tennis court; the cameras having known 3D locations with associated x, y, z coordinates, with the pitch and yaw of the horizontal plane of the camera head also being known; and each camera being mounted onto a servo encoded pan tilt head.
This synchronisation enables a computer to determine the camera's field of view via the encoder's reading of the pan, tilt, zoom and focus settings. As a result the operator sees an accurate virtual 3D model superimposed over the real world video. Thus when a camera's field of view moves, then the synchronised 3D model also precisely moves in real time. This synchronisation now enables one human operator to accurately command and control in real time multiple cameras around a designated area and see the camera vision and the superimposed 3D geometric and spatial software functions working. This can enable far superior accuracy of framing and focusing on dynamic targets.
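By way of illustration only, the superimposition described above amounts to projecting surveyed 3D points into the camera's picture using the encoded pan, tilt and zoom readings. The simplified pinhole sketch below is not the patent's method: it ignores lens distortion, head roll and the calibrated lens model, and all names are assumptions.

```python
import math

def project_point(camera_pos, pan_deg, tilt_deg, fov_deg, point):
    """Project a surveyed 3D point into normalised image coordinates for a
    camera on an encoded pan/tilt head. Returns None if the point is
    behind the camera."""
    # translate the point into camera-relative coordinates
    x = point[0] - camera_pos[0]
    y = point[1] - camera_pos[1]
    z = point[2] - camera_pos[2]
    # undo the pan (rotation about the vertical axis)
    p = math.radians(pan_deg)
    xr = x * math.cos(p) + y * math.sin(p)
    yr = -x * math.sin(p) + y * math.cos(p)
    # undo the tilt (rotation about the horizontal axis)
    t = math.radians(tilt_deg)
    depth = xr * math.cos(t) + z * math.sin(t)
    zr = -xr * math.sin(t) + z * math.cos(t)
    if depth <= 0:
        return None
    # normalise by the half-angle of the encoded field of view
    scale = math.tan(math.radians(fov_deg) / 2)
    return (yr / depth / scale, zr / depth / scale)
```

Re-projecting the model every time the encoders report new pan, tilt or zoom values is what keeps the 3D overlay moving precisely with the real-world video.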
In one form the control means further includes a broadcast switching device to enable the operator to select the footage that is to be broadcast or recorded. The components of the apparatus, such as the cameras, display means and control means, may be connected by way of a communication means such as, but not limited to, a modem communication path, a computer network such as a local area network
(LAN), Internet, RF or fixed cables. This means that a user can control the operation of multiple cameras from a single location.
In another form the processor and the memory cooperate with each other and with other components of a computer to perform all of the functionality described herein. In another form the processor executes appropriate software to perform all of the functionality described herein. In still another form the control means is a computer including RAM and ROM memory, a central processing unit or units, input/output (IO) interfaces and at least one data storage device. The computer includes application software, stored in a computer readable medium on a storage device, for controlling the cameras and performing functions. The apparatus may include embedded software or firmware with corresponding hardware that is designed to perform one or more dedicated functions.
In a third aspect of the invention there is proposed a software program for controlling the operation of the preceding apparatus and for the application of the preceding and following methods.
In one form the control means includes a computer monitor with a virtual model or map of the playing surface which is overlaid in real time over the
synchronised camera's view, which has the same perspective as the virtual model. The virtual model may include such things as the boundaries of the playing surface, goals and relevant line markings. It is within the computer model that the operator can command and control the various geometric and spatial software functions and see them working over the camera's video.
In accordance with a fourth aspect of the invention there is proposed a motion picture capturing apparatus including, a first camera for capturing a dynamic primary image of a moving object, at least one second camera for capturing a dynamic halo image that substantially extends around the primary image, and a control means for controlling the position of said dynamic images.
In one form a track node may be stored within software to facilitate the positioning of the said primary and halo images. Track nodes are mathematical points that can be assigned to track vehicles, players or the match ball to give them a positional reference. The real time position of the track node is governed by, but not limited to, GPS devices, RF tagging devices, optical recognition devices, and manual tracking using either a mouse or a stylus on a touch screen. Images can be individually assigned to specified track nodes. Track nodes can spatially interact with the images in a variety of ways. A track node may be locked onto the cutting plane, thereby setting the height of the track node away from the playing surface, while allowing the track node to travel across the cutting plane in any direction, speed and acceleration. The track node can also be offset from the cutting plane in a variety of methods that include, but are not limited to, a wheel on a mouse, a wheel within a control interface, and depressing a button and using a touch screen stylus moved either up or down the touch screen.
The computer uses the position of the track node to calculate the subject distance for the lenses' focus settings, thereby enabling the area around the track node to always be in focus. The subject distance is the distance from the lens to the subject or tracked target. Multiple track nodes can be utilised where there are multiple targets requiring tracking. Nominated cameras can be exclusively assigned to specified track nodes while interacting with the software devices.
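As a minimal sketch of this calculation (the function and variable names are the editor's assumptions, not taken from the specification), the subject distance driving the servo focus is simply the straight-line distance from the camera's known 3D location to the track node:

```python
import math

def subject_distance(camera_pos, track_node):
    """Distance from the lens to the tracked target: the value that
    would be fed to the calibrated servo focus so the area around the
    track node stays sharp. Both arguments are (x, y, z) tuples."""
    return math.dist(camera_pos, track_node)
```

With multiple track nodes, this would simply be evaluated once per camera/node assignment.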
In accordance with a fifth aspect of the invention there is proposed a software function herein referred to as a cutting plane. The cutting plane enables the images to take their z-axis position from the cutting plane's surface. The cutting plane is a mathematical plane contained within software that is offset from the playing surface at variable heights. The plane can be parallel to a designated surface, or it can be a curved or variable surface over the playing field or surface. The cutting plane can also be shaped into any profile, such as a plane that is offset 1 meter from, and parallel to, a complex and undulating motor racing track. Typically cutting planes will extend well beyond the primary playing area into secondary areas, such as the surrounding playing areas, grandstands and vehicular run off areas. The primary function of the cutting plane is to allow the track nodes, and thereby the captured images, to travel across the cutting plane's surface or be offset from it. The cutting plane enables better accuracy when tracking motor vehicles because the vehicle's height above the racing track is always known (unless the vehicle is flying), and therefore GPS tracking inaccuracies in the Z direction or height can be removed.
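The cutting plane's removal of vertical tracking error can be sketched as follows (an editor's illustration only; `surface_height` and the fixed offset are assumptions standing in for whatever surveyed surface model the system holds):

```python
def snap_to_cutting_plane(x, y, surface_height, offset=1.0):
    """Lock a track node onto a cutting plane that sits a fixed offset
    above an arbitrary (possibly undulating) track surface.

    surface_height(x, y) returns the surveyed surface elevation at
    (x, y); the returned z discards any GPS error in the vertical axis.
    """
    return (x, y, surface_height(x, y) + offset)
```

A flat soccer pitch would use a constant `surface_height`; a motor racing track would use its surveyed elevation profile, so a node 1 m above the tarmac is always 1 m above the tarmac regardless of GPS altitude noise.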
In one form a bias zone contained within the software interacts with the track node's position within the bias zone to dictate how the images are positioned around the track node. Bias zones have variable patterns that include, but are not limited to, orthogonal patterns with one or two axes, or concentric circle or oval patterns. The track node may travel either side of the bias zone's x axis, and the further the track node is from that x axis, the further the track node is from the image's x axis, while still staying within the image's limit line. Multiple bias zones may also be utilised, for example an orthogonal bias zone covering an entire soccer field and two concentric circle bias zones, each with a 30m radius, centred on each goal. The resultant effect on the halo images around the track node is based on the averaging of the two bias zones' effects, which of course is dependent on the track node's position within the bias zones.
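One possible reading of this averaging behaviour is sketched below (the editor's interpretation; the zone representation and linear mapping are assumptions — the specification also allows logarithmic relationships). Each zone maps the node's percentage position along the zone's axis onto the image location field's axis, clamped to the limit lines, and the halo's offset is the mean of all zone effects:

```python
def halo_offset_pct(node_xy, zones):
    """Average the effect of several bias zones on the halo image's
    position around the track node.

    Each zone is a (centre_x, half_width) pair: the node's percentage
    position along the zone's x axis (clamped to +/-100, the image's
    limit lines) maps straight onto the image location field's x axis,
    and the result is the mean of all zone effects.
    """
    effects = []
    for centre_x, half_width in zones:
        pct = 100.0 * (node_xy[0] - centre_x) / half_width
        effects.append(max(-100.0, min(100.0, pct)))
    return sum(effects) / len(effects)
```

A node 40 m off-centre in a zone 50 m wide sits at 80% of the image location field; adding a second zone centred on the node pulls the average back towards centre, giving the halo its bias towards that zone's location.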
In still another form a direction of travel function may be stored within the software and in one form may be manually controlled via an adjustable slide device which has a neutral middle position and variable forward and back calibrations. The direction of travel creates leading space forward of or behind the track node within the images. The further the slide is from its neutral position, the further the halo image's centre is offset from the track node. For example, 90% forward on the slide results in the track node being located 90% back from the image's centre, thereby generating a very large leading space within the halo image in front of the track node. The magnitude of the leading space, or distance between the track node's position and the offset from the image location field's y axis, is proportional to the magnitude of the direction of travel. The side of the image on which the leading space occurs is governed by the operator and is typically dependent on which way the ball is going.
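The worked example in the preceding paragraph reduces to a simple sign-flipped mapping, sketched here with editor-assumed names (slide and offsets expressed as percentages, matching the image location field's +/-100% limit lines):

```python
def node_position_in_image(slide_pct):
    """Map the direction-of-travel slide (-100 back .. 0 neutral ..
    +100 forward) to the track node's offset within the halo image's
    location field: 90% forward on the slide places the node 90% back
    from the image's centre, leaving leading space in front of it."""
    return -max(-100.0, min(100.0, slide_pct))
```

The clamp keeps the node inside the image's limit lines even if the slide calibration is exceeded.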
In still another form a bump bar function may be stored within the software. Bump bars are a software spatial ordering function that enables the images to bump into them, but generally does not let the images pass over their geometric alignment. Bump bars are like a fence that can be aligned where required, for example to frame the perimeter of the playing field. Bump bars have a variable deceleration setting that enables the halo images to cushion into the bump bars before contact occurs.
The images have three optional functionalities that enable them, firstly, to recognise bump bars and cushion into them, secondly, to ignore the bump bars and their associated functions, and thirdly, a hybrid option where the halo images use the bump bars until the primary image crosses the bump bar, at which point the halo image will continue to surround the primary image as both images cross over the bump bars. The bump bars stop the specified images from departing the area of the playing field, thereby keeping the cameras' fields of view on the playing surface and on the players.
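The cushioning behaviour of the first option can be sketched in one dimension as follows (an editor's illustration; the proportional-deceleration model and all names are assumptions — the specification only states that the deceleration setting is variable):

```python
def cushion_edge(leading_edge, velocity, bar_pos, decel_zone):
    """Slow the halo image's leading edge as it nears a bump bar and
    clamp it so it never crosses the bar's alignment.

    Within `decel_zone` of the bar, velocity is scaled down in
    proportion to the remaining gap (the 'cushion'); at the bar the
    image stops. Returns the new edge position and applied velocity.
    """
    gap = bar_pos - leading_edge
    if gap <= 0:
        return bar_pos, 0.0           # already at the bar: hard stop
    if gap < decel_zone:
        velocity *= gap / decel_zone  # proportional cushioning
    new_edge = min(leading_edge + velocity, bar_pos)
    return new_edge, velocity
```

Far from the bar the image moves freely; inside the deceleration zone it eases in, and at the bar it stops while the primary image is still free to slide within the halo.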
In yet still another form a picture frame function may be stored within the software. The picture frame is a software ordering function that graphically shows the camera's "16 x 9 picture plane" around the captured image. The sides of the picture frame always touch the images' external edges relative to the viewing alignment of the camera. As such if the image expands then the picture frame expands.
The sill and head heights of the picture frame, and the centre of the picture frame, can be set in a variety of methods. Firstly, the bottom alignment of the picture frame, or sill, can have a vertical offset distance from either the cutting plane or the track surface at the track node's location; secondly, the picture frame can be set so that a specified horizontal axis or band of the picture frame always retains the track node on it while the picture frame holds the entire captured image; and thirdly, the side of the captured image closest to the camera can rest on the picture frame's sill. An additional overriding function on the picture frame's head height is the proportional head room function, which interacts with the size of the images and the height of the cutting plane: when the picture frame's top alignment has reached a certain specified height above the playing surface, the top will not drop any further, and if the picture frame needs to reduce in size because of a contracting image size, the picture frame's bottom alignment or sill will rise, allowing the picture frame to shrink in size. This proportional framing function can also be used in an inverse fashion, so that the operator can zoom in on the players' feet in a similar manner. Picture frames and the visual limit plane have a geometric relationship that stops the picture frame from passing across a visual limit plane.
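One possible reading of the proportional head room rule is sketched below (the editor's interpretation only; the pinning state and all names are assumptions). Once the frame's top reaches the specified cap it stays pinned there, so a shrinking frame reduces its height by raising the sill rather than dropping the top:

```python
def proportional_head_room(sill, height, head_cap, pinned):
    """Apply the proportional head room rule to a picture frame.

    sill + height gives the frame's top (head). Once the head has
    reached head_cap, it is pinned at the cap; any later change in
    frame height is absorbed by moving the sill, not the head.
    Returns (sill, head, pinned).
    """
    if pinned or sill + height >= head_cap:
        return head_cap - height, head_cap, True
    return sill, sill + height, False
```

The inverse fashion mentioned in the specification (zooming in on a player's feet) would mirror this logic with the sill pinned instead of the head.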
In a further form a visual limit plane function may be stored within the software. A visual limit plane is of any size and shape and can be positioned at any horizontal, vertical or angular alignment. The visual limit plane is a spatial software function that restricts the camera's view from looking past a specified alignment or plane. The visual limit plane affects the camera's zoom, pan and tilt. In a typical sporting application like soccer, the visual limit plane will be located just under the roof line of the stadium; when the wide field of view camera and its associated wide image are tracking a player on the far side of the field, the head of the picture frame will contact the visual limit plane, stopping the camera's field of view from seeing under the stadium roof and pushing the camera's field of view further onto the playing field where the action is. Visual limit planes can be set individually for each camera and are particularly useful when located just under the roof of stadiums, at stage boundaries, or at the edges of unsightly structures. The operator can set the visual limit planes and bump bars in appropriate positions within the 3D model, which is superimposed over the real time video, and examine all camera views for functionality and aesthetic composition. In still another form a split button function may be stored within the software and enables the operator to push a button, thereby releasing the specified images from the cutting plane to follow a target such as a basketball through a path of travel. When the split button command is activated, the system recognises the track node's location and draws a base line from that point to the designated target point, which can be the centre of the basketball or netball hoop. In basketball, the operator can depress the split button and then track the flying ball through the air using the stylus on the touch screen.
Assuming the ball is directed at the hoop, the 3D model understands the base line direction of travel and the vertical offsets created by the flight of the ball. This enables the cameras to follow the ball's flight path. In still a further form an image tally light function may be stored within the software. The image tally light indicates to the operator which camera is being used at any given moment as the live feed. The image tally light may highlight the live feed camera's halo or picture frame.
In yet still another form a vista line function may be stored within the software, which creates a series of lines within the virtual 3D computer model that start at a camera location and extend to the tangent points on both sides of that camera's images. The lines may be terminated at the image's tangents, or the cutting plane, or a designated distance past the image. Similarly the centre vista line starts at the camera location and extends to the track node, and may terminate at the track node, or the cutting plane, or a designated distance. In still yet a further form a hierarchy of commands function may be stored within the software. Many of the aforementioned functions interrelate with each other and in some circumstances may need to override each other. As such a hierarchy of commands is structured within the system requirements, enabling commands to overrule other commands.
In yet still a further form a relative zoom points function may be stored within the software. This software function enables a point on the cutting plane to be selected, for example the soccer goals, and for that point to stay in the same location within the camera's field of view as the operator zooms in or out, either by manual controls or in a preset manner. This software command can also utilise the camera's picture plane via the system's understanding of the lens's field of view.
In another form a pan point function may be stored within the software and enables the operator to select two points, a genesis point and a terminus point, whereby the designated camera will pan between these points along a designated path. This designated path or spline can be adjusted by the operator to form any alignment within a 3D space. The zoom setting or key framing at the genesis and terminus points, and at any number of points along the spline, can be designated so that the lens's zoom interpolates evenly between them as the camera's centre of view pans along the spline. Time, zoom settings, and speed between the pan points can be specified.
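A simplified sketch of this pan point behaviour, assuming a straight spline and linear key-frame interpolation (the editor's assumptions; the specification allows any 3D spline shape), could look like:

```python
def pan_point(genesis, terminus, zoom_keys, t):
    """Interpolate the camera's centre of view along a straight spline
    from genesis to terminus at parameter t in [0, 1], and interpolate
    the lens zoom evenly between key-framed settings.

    genesis/terminus are (x, y, z) tuples; zoom_keys is a list of
    (t, zoom) pairs sorted by t and covering the span of interest.
    """
    pos = tuple(g + t * (e - g) for g, e in zip(genesis, terminus))
    for (t0, z0), (t1, z1) in zip(zoom_keys, zoom_keys[1:]):
        if t0 <= t <= t1:
            zoom = z0 + (t - t0) / (t1 - t0) * (z1 - z0)
            break
    else:
        zoom = zoom_keys[-1][1]  # past the last key frame
    return pos, zoom
```

Additional key frames along the spline simply add more (t, zoom) pairs; timing and speed settings would drive how t advances per video frame.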
BRIEF DESCRIPTION OF THE DRAWINGS
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate implementations of the invention and, together with the description and claims, serve to explain the advantages and principles of the invention. In the drawings,
Figure 1 is a schematic view of a primary image and the surrounding halo
image, defined by a respective field of view and subject distance;
Figure 2 is a schematic view of a first embodiment of the apparatus for camera control of the present invention;
Figure 3a is a schematic view of the various configurations of the primary image area and surrounding halo image area of figure 1 illustrating the bump bars around the periphery of the playing arena;
Figure 3b is a schematic view of a primary and halo images and their interaction pattern as they move within the bias zone, showing that the interaction pattern is firstly based upon the position of the track node within the bias zone and secondly the position of the bump bars;
Figure 3c is a schematic view of a fixed size primary and halo images and their interaction pattern as they move within the circular bias zone;
Figure 3d is a schematic view of a fixed size primary image and variable size halo image and their interaction pattern as they move within the circular bias zone;
Figure 3e is a schematic view of a halo and its component parts;
Figure 3f is a schematic view of some of the embodiments of a halo;
Figure 3g is a schematic view of a bias zone and its component parts;
Figure 3h is a schematic view of some of the embodiments of a bias zone;
Figure 4a is a schematic view of the primary image of figure 1 illustrating a first embodiment of the vertical barrier above the playing surface;
Figure 4b is a schematic view illustrating a second embodiment of the vertical boundary above the playing surface;
Figure 5 is a schematic view illustrating a further embodiment; and
Figure 6 is an overhead view of the movement of a player across a playing surface illustrating the position of the images captured by the first and second cameras.
DETAILED DESCRIPTION OF THE ILLUSTRATED AND EXEMPLIFIED EMBODIMENTS
There are numerous specific details set forth in the following description. However, from the disclosure, it will be apparent to those skilled in the art that modifications and/or substitutions may be made without departing from the scope and spirit of the invention. In some circumstances specific details may have been omitted so as not to obscure the invention. Similar reference characters indicate corresponding parts throughout the drawings.
Referring to the drawings for a more detailed description, a motion picture capturing apparatus 10 is illustrated, demonstrating by way of examples
arrangements in which the principles of the present invention may be employed. As illustrated in figure 1, the motion picture capturing apparatus includes a first camera 12 for capturing a dynamic primary image 14 of an object 16, the primary image 14 being defined by the field of view 18 and subject distance 20 of the lens 22 of the first camera 12. The apparatus 10 further includes a second camera 24 for capturing a dynamic halo image 26 that contains and extends around the primary image 14, the halo image being defined by the field of view 28 and subject distance 30 of the lens 32 of the second camera 24. The dimensions of at least the halo image 26, and the position of the primary image 14 therewithin, may be altered depending upon the direction of travel and behaviour of the object 16. As illustrated in figure 2 the apparatus 10 can be used to capture footage of a sporting contest, such as a game of soccer. The first and second cameras 12, 24 are placed around a playing surface, in this example being a soccer field 34 having a boundary line 36, various field markings 38 and opposing goals 40, 42. A third camera 44 is configured to capture an image 46 of the playing field 34. Signals are received from and sent to cameras 12, 24 and 44 by way of communication means 48. The communication means 48 may be hard wired to the cameras or be connected by way of a transmitter/receiver.
The communication means 48 is connected to a control means 50, including a touch screen 52, for displaying image 46, and stylus 54, for controlling the images captured by the first and second cameras 12, 24, and a broadcast switcher 56 in communication with a broadcast tower 58 for controlling the television images broadcast. The broadcast switcher 56 includes switches 60, 62 for selecting the desired images for broadcasting.
As further illustrated in figure 2 the object 16 is a soccer player 64 who is kicking a ball 66 down the field 34 in the direction of arrow 68, which indicates the direction of travel. The direction of travel is communicated to the control means 50 via the joystick 74. When in use the image 46 of the field is displayed on the touch screen 52. The operator uses the stylus 54 to position the track node 11 in the centre of the play between the soccer player 64 and the soccer ball 66. The size of the images can be controlled via the rotation of the joystick's knob 75. The movement of the stylus 54 across the display means 52 generates digital signals representative of the required panning, tilting, focusing and zoom operations of the cameras 12, 24 and their lenses 22, 32 to track an object 16 across surface 34.
The operator can either select to follow an individual player that is in control of the ball, or the ball itself, depending upon the required shots and whether the ball is being passed between players. The movement of the stylus 54 across the screen 52 results in corresponding movement of cameras 12, 24. It should however be appreciated that the user's finger or tracking subsystems could be used instead of the stylus 54 to track movement of the object 16 across the touch screen 52. The stylus 54 is used to control the first camera such that the track node 11 of the primary halo corresponds to the position of the stylus 54 on the image 46 displayed on the screen 52. In the present embodiment, the position of the stylus 54 controls the position of the halo 26 around the primary image 14.
In another embodiment as illustrated in figure 2, the images 14, 26 captured by the first and second cameras 12, 24 are displayed on screens 70, 72. The screens 70, 72 are used so that the operator can select the best image for broadcasting. The reader should however appreciate that the display means 52 may include the images captured by the cameras or the apparatus may include a separate split screen displaying the images captured by the various connected cameras.
The apparatus 10 utilises a joystick 74 for controlling the direction of travel although in another form this joystick 74 can be used for controlling the position of the images around the track node 11. The joystick knob 75 may also be used to control the dimensions of the primary and/or halo images.
The computer includes application software for controlling the computer, receiving data from the screen 52, stylus 54 and joystick 74. The software is configured to generate appropriate signals to control the servo-assisted camera heads and encoded lenses that control pan, tilt, focus and zoom of the cameras 12, 24 depending upon the signals received from the screen 52, stylus 54 and joystick 74. Application software may be stored in a computer.
The lenses 22, 32 are calibrated either by using the manufacturer's data or by setting up the camera and lens in a known environment and recording the focus and zoom settings at variable distances and variable fields of view. Encoders recognise these focus and zoom settings and this data is stored; alternatively the analogue settings of the lens may be used, but this will not be as accurate. System algorithms utilise this data to enable automated lens control. Thus focus for each lens is achieved by knowing the distance between the camera location and the track node 11. The lens's zoom is achieved by knowing the size of the image 14 and the distance between camera 12 and image 14, then applying the calibrated lens's algorithms to facilitate the correct field of view (zoom). The cameras' servo driven pan tilt heads are also encoded, thereby enabling the system to recognise, command and control the direction of the camera's alignment. The camera control system can be used to record images of various sporting activities. As illustrated in figure 3a, the apparatus 10 can be used to capture footage of a basketball game played on a basketball court 76 having court markings 78, a boundary line 80 and opposing hoops 82 and 84. In one embodiment the control means 50 includes a virtual map of the playing surface. This virtual map includes the respective court markings, boundary line and position of the basketball hoops. The virtual map also includes a virtual barrier or bump bar 86 that constrains the movement of the first and second cameras to thereby control the images 14, 26 that are captured. The reader should appreciate that this prevents unwanted footage being captured, such as running tracks around the outside of the playing field or images of the edge of the crowd or empty seats.
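The lens calibration described above amounts to a lookup table of measured values against servo settings, which system algorithms interpolate between at run time. A hedged sketch (editor's names; linear interpolation is an assumption — a real system might fit a higher-order curve to the charted data):

```python
def servo_setting(table, value):
    """Linearly interpolate a servo setting from a calibration table.

    table is a list of (measured_value, servo_setting) pairs recorded
    at known distances or fields of view, sorted by measured_value.
    Values outside the calibrated range are clamped to the end points.
    """
    if value <= table[0][0]:
        return table[0][1]
    for (v0, s0), (v1, s1) in zip(table, table[1:]):
        if v0 <= value <= v1:
            return s0 + (value - v0) / (v1 - v0) * (s1 - s0)
    return table[-1][1]
```

The same routine serves both focus (subject distance in, focus servo counts out) and zoom (required field of view in, zoom servo counts out).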
As illustrated in figure 3a, when the cameras 12 and 24 are located above the playing surface, the edges of the respective fields of view of cameras 12, 24, and therefore the images 14, 26 that are captured, are restrained from crossing the bump bar 86. In a situation, as illustrated by event 88, when the object 16 being tracked is at a distance from the boundary line 80, the operator can control the position of the primary image 14 within the halo image 26. However when the object comes into close proximity to the boundary line 80, as illustrated by events 90a, 90b and 90c, the relationship between the two images 14, 26 is automatically altered by interaction with the bump bar 86. The dimensions or orientation of the halo image 26 and the primary image may both be changed. In this way the bump bar 86 acts like a cushioning fence adjacent the boundary of the court to prevent unwanted footage being captured. Figure 3b illustrates the variable relationship between primary image 14 and halo image 26, dependent on the position of the track node 11 within the bias zone 6 and the direction of travel 68, which is set at 50% left. The illustration shows that when the direction of travel 68 is 50% left, the track node 11 is at the +50th percentile within the halo image 26 location field's y axis throughout the bias zone 6, until the halo image 26 collides with the bump bar 100, at which time the halo image 26 stops and the primary image 14 is allowed to slide to the left within the halo image 26. Figure 3b also shows that when the track node is on the bias zone's 80% x axis 31 alignment, the secondary image location field has the track node on its 80% x axis 17a alignment. Similarly when the track node is on the bias zone's -40% x axis alignment, the secondary location field has the track node on its -40% x axis alignment.
And once again when the track node is on the bias zone's -80% x axis alignment, the secondary location field has the track node on its -80% x axis alignment.
The centre of the image's X & Y axis is 0% and the image's limit lines 19 are +/- 100%. The properties of the bias zone can also be changed, and this includes both linear and logarithmic relationships between bias zones and the track node's position within the location field. Multiple overlapping bias zones can be used together, which enables an averaging of the bias zones effects on the image's position around the track node. This enables the halo cameras to have a particular bias towards a geographical location such as a soccer goal.
Concentric circle bias zones, as in figures 3c and 3d, work in a different manner to those discussed previously. Concentric circle bias zones control the halo image's position around the track node. This is enabled by creating an alignment line 19a between the track node 11 and the centre of the bias zone 6, which is extended at the track node end so as to bisect the primary image, or alternatively the alignment line is extended an additional percentage or offset distance. The operator's preset options include: fixing the size of the secondary image as per figure 3c; enabling the size of the secondary image to expand and contract while always keeping the centre of the bias zone and the primary image within its limit line as per figure 3d; enabling the primary image to be positioned within the secondary image in accordance with typical bias zone methods as per figure 3c; and having the primary image always tangential to the secondary image's limit line as per figure 3d. The methods illustrated in figures 3c and 3d are useful in numerous sporting applications where goals are being used and the television viewer's focus of attention is generally where the game ball is and where the goals are. This would be the case in soccer, netball, ice hockey and basketball. Similarly in cricket, the entire cricket pitch can be part of the bias zone centre, which is always within a camera's halo, as is the ball as it is hit around the cricket grounds.
Figure 3b shows that the track node is central within the primary image regardless of the track node's direction of travel or the node's position within the bias zone, although the primary image does have the same functionality as the halo image to have the track node offset within itself, dependent on the direction of travel and the track node's location within the bias zone.
Primary and halo images can have a preset maximum and minimum size. The centres of the image's axes are 0% and the limit lines are +/- 100% on all axes. Both a linear and a logarithmic relationship can be used between the direction of travel and the track node's position within the location field. In another form the bias zones, images and image location fields may all be
3D spatial structures working with similar methodologies as previously described, although with 3D properties. Adopted 3D structures may include spheres, cylinders, cones, or rectangular prisms. In this instance a GPS tag would typically be used to establish the real time 3D location of the track node. As illustrated in figures 4a and 4b, the virtual map of the court 76 stored on the control means 50 is in three dimensions. In the present embodiment the virtual map includes a cutting plane 92, which is used to control the plane on which the images 14, 26 move. The height of the cutting plane 92 can be varied. The position of the stylus 54 on the cutting plane typically generates the location of the track node. Figure 4a illustrates an area 94 or image that a number of cameras may be focused on. In basketball the ball is typically passed at chest height, hence the cutting plane is located at chest height as per figure 4a. Activity in soccer generally occurs at ground level, hence the cutting plane 92 would be lowered accordingly.
As further illustrated in figures 4a and 4b the virtual map includes barrier 96, which inhibits the vertical movement of the field of view 18 (figure 1) above a certain plane. The barrier 96 can be either parallel to the playing surface 76, as illustrated in figure 4a, or may take any form or shape, including being sloped upwardly from a midpoint of the court to the opposing goals 82, 84, as illustrated in figure 4b. The barrier 96 above the playing surface acts like a virtual roof and prevents footage being captured of unwanted detail such as empty spectator stands. When a target is in correct sharp focus, the distance between the focal point of the lens and the target is known as the subject distance 20. The end point of the subject distance may be coupled to the object 16 or the centre of the halo image 14, 26.
As illustrated in figure 5, the plane of the halo image 26 can be offset from the plane of the primary image 14. This action may occur from a bias zone interaction affecting only halo image 26. The position of image 26 enables both the basketball hoop 82 and player 64 to be in shot, and the focus to be as sharp as possible.
In another form the primary and halo images may be uncoupled, whereby one halo image tracks an object such as a ball while the other halo image is trained in a prescribed manner onto the landing zone of the ball, which is calculated via the ball's trajectory. This function can be activated by the operator or be automatic.
Multiple cameras can be used to capture the primary image 14 and halo image 26 from different perspectives. As illustrated in figure 2, cameras 12, 12a, 12b are used to capture respective primary images 14 and cameras 24 and 24a are used to capture respective halo images 26. It should be noted that each camera can have its own halo image and bias zone, and as such the number of halo sizes at any one time is only limited by the number of cameras. Accordingly, this gives the operator greater flexibility in selecting a suitable image for broadcasting.
As illustrated in figure 6, the apparatus 10 can be used to provide footage of a soccer game being played on a soccer field 34. The present example includes plays 94 and 96 that will be used to illustrate the relationship between the primary and halo images 14 and 26. The first play 94 starts at the kickoff from the centre circle, when the ball is located on the centre spot. The primary image 14 is positioned at a centre point of the halo image 26, as illustrated by event 98. This means that all players within the vicinity will be included in the halo image 26. As play progresses and player 64 runs down the field, as illustrated by event 100, the primary image 14 is positioned towards the trailing edge of the halo image 26. This means that the halo image extends forward of the player 64 even when the player changes direction, as illustrated by event 102. When the ball passes over the boundary line 36, as illustrated by event 104, the halo image 26 is inhibited from extending beyond the bump bar 86.
In the second play 96, a corner is taken, as illustrated by event 106, wherein the halo image 26 is enlarged to capture a larger portion of the playing field.
Although not illustrated, the reader should appreciate that the halo image 26 could be large enough to capture the players in front of the goal 84. The ball is then kicked to the centre and directed into the goal 84, as illustrated by event 108. As the ball changes direction, the halo image 26 captured by camera 24 also changes orientation to include the goal and goalie.
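Reorienting a camera such as camera 24 toward a new target (the goal and goalie) reduces to computing pan and tilt angles from the camera position to a target point. The following is a minimal geometric sketch; the function and coordinate convention are assumptions, not part of the specification:

```python
import math

def pan_tilt_to(cam_pos, target):
    """Pan and tilt angles (degrees) that aim a camera at a target point.

    Pan is measured in the ground plane from the x axis; tilt is the
    elevation above the ground plane.
    """
    dx = target[0] - cam_pos[0]
    dy = target[1] - cam_pos[1]
    dz = target[2] - cam_pos[2]
    pan = math.degrees(math.atan2(dy, dx))
    tilt = math.degrees(math.atan2(dz, math.hypot(dx, dy)))
    return pan, tilt

# Hypothetical example: camera at the origin aimed at a point on the ground.
pan, tilt = pan_tilt_to((0.0, 0.0, 0.0), (1.0, 1.0, 0.0))
```

A servo-assisted pan-tilt head would then be driven toward these angles as the halo image changes orientation.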
The skilled addressee will now appreciate the many advantages of the illustrated invention. In one form the invention provides an apparatus and method of controlling a plurality of cameras to capture footage of a sporting or stage event. The use of at least a first camera that captures a primary image conforming to the target object, and a halo image captured by a second camera having a wider field of view, means that a single operator can simply and effectively control the composition of the television broadcast. The use of a central control unit enables the operator to control a number of cameras by simply passing a stylus over the surface of a touch screen displaying live footage of the sporting arena.
Various features of the invention have been particularly shown and described in connection with the exemplified embodiments of the invention; however, it must be understood that these particular arrangements are merely illustrative and that the invention is not limited thereto. Accordingly, the invention can include various modifications which fall within the spirit and scope of the invention. It should be further understood that for the purpose of the specification the word "comprise" or "comprising" means "including but not limited to".

Claims

THE CLAIMS DEFINING THE INVENTION ARE AS FOLLOWS:
1. A method of obtaining motion picture footage of a moving object including the steps of:
capturing a dynamic primary image of said object using a first motion picture camera; and
capturing a dynamic halo image that substantially extends around said primary image using a second motion picture camera, wherein said first and second motion picture cameras are controlled such that the position of the halo image relative to the primary image can be altered.
2. The method in accordance with claim 1 wherein the first and second motion picture cameras are controlled such that the primary image retains a portion of the halo image and the position of the halo image relative to the primary image can be altered.
3. The method in accordance with claim 1 or 2 wherein the first and second motion picture cameras, configured to capture the primary and halo images, have respective fields of view and subject distances that can be altered relative to the other motion picture camera or independently thereof.
4. The method in accordance with any one of claims 1 to 3 wherein said object is a dynamic target such as a sports player, ball or stage performer, wherein the primary and halo images include motion picture footage of at least the dynamic target.
5. The method in accordance with claim 4 wherein the primary and halo images further include motion picture footage of an individual or individuals engaged in the sporting contest, goals, wickets, relevant line markings, or stage sets.
6. The method in accordance with any one of claims 1-5 wherein the object may be tracked using a tracking device, at least one of said first or second cameras being assigned to follow said object.
7. The method in accordance with claim 6 wherein the tracking device includes an RF or GPS tag in communication with a control means for controlling the operation of the first and second cameras to capture primary and halo images.
8. The method in accordance with any one of the above claims wherein the dynamic halo image can be coupled to the dynamic primary image, and typically positioned around it, such that the movement of the second motion picture camera is dependent upon the movement of the first motion picture camera.
9. The method in accordance with any one of claims 1 to 7 wherein the dynamic halo image can be uncoupled from the dynamic primary image such that the first motion picture camera capturing the dynamic primary image may follow the trajectory of a ball and the second motion picture camera capturing the dynamic second halo image may capture footage of the expected landing area that has been calculated from the trajectory of said ball.
10. The method in accordance with any one of the above claims wherein a plurality of halo images may surround the primary image.
11. The method in accordance with any of the above claims wherein the track node's x and y location can be determined on a cutting plane which has a prescribed z value, either manually by an operator or by a tracking system.
12. The method in accordance with any of the above claims wherein the height of the primary and halo images from the ground plane of the stage or sporting field can be varied.
13. The method in accordance with any of the above claims wherein the camera's centre of view may have an angular or distance offset relative to the centre of the primary and halo images.
14. The method in accordance with any of the above claims wherein a track node can be assigned to a tracked object, and the height of the track node from the ground plane of the sporting field may be varied.
15. The method in accordance with any of the above claims wherein the position of the track node within a bias zone can affect the spatial relationship between the track node and the surrounding primary and halo images.
16. The method in accordance with any of the above claims wherein the movement of the primary and secondary halo images may be restrained from travelling past designated alignments in both the horizontal and vertical planes.
17. The method in accordance with any one of the above claims wherein the steps of the method are undertaken using designated software and hardware.
18. A motion picture capturing apparatus including a first camera for capturing a dynamic primary image of a moving object, at least one second camera for capturing a dynamic halo image that substantially extends around the primary image, and a control means for controlling the position of said dynamic halo image around said dynamic primary image.
19. The motion picture capturing apparatus in accordance with claim 18 wherein the first camera and at least one second camera are controlled by servo-assisted pan tilt heads and servo-assisted lenses configured to control the focus, zoom and direction of the first and at least one second cameras, wherein at least the focus, zoom and direction of the cameras can be altered by use of said control means that includes a user interface whereby the position of the halo image relative to the primary image can be altered.
20. The motion picture capturing apparatus in accordance with claim 18 or 19 wherein the user interface includes a touch screen showing at least motion picture footage and a synchronised model of a defined area, the defined area being selected from a group including a sporting arena, playing field, playing court, stage, room, pitch and oval.
21. The motion picture capturing apparatus in accordance with claim 19 or 20 wherein the primary image can be uncoupled from the halo image, the uncoupling of the halo image from the primary image being undertaken in an automatic mode by way of software when the target object is located within a predetermined space, including a goal square, or by a user uncoupling the primary image from the halo image by way of the user interface.
PCT/AU2010/000886 2009-08-31 2010-07-13 A method and apparatus for relative control of multiple cameras WO2011022755A1 (en)

Priority Applications (5)

Application Number Priority Date Filing Date Title
US13/392,515 US20120154593A1 (en) 2009-08-31 2010-07-13 method and apparatus for relative control of multiple cameras
EP10811004.0A EP2474162B8 (en) 2009-08-31 2010-07-13 A method and apparatus for relative control of multiple cameras
CN201080038605.XA CN102598658B (en) 2009-08-31 2010-07-13 The relation control method and apparatus of multiple cameras
AU2010286316A AU2010286316B2 (en) 2009-08-31 2010-07-13 A method and apparatus for relative control of multiple cameras
JP2012525813A JP5806215B2 (en) 2009-08-31 2010-07-13 Method and apparatus for relative control of multiple cameras

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
AU2009904169A AU2009904169A0 (en) 2009-08-31 A method and apparatus for relative control of multiple cameras
AU2009904169 2009-08-31

Publications (1)

Publication Number Publication Date
WO2011022755A1 true WO2011022755A1 (en) 2011-03-03

Family

ID=43627063

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/AU2010/000886 WO2011022755A1 (en) 2009-08-31 2010-07-13 A method and apparatus for relative control of multiple cameras

Country Status (6)

Country Link
US (1) US20120154593A1 (en)
EP (1) EP2474162B8 (en)
JP (1) JP5806215B2 (en)
CN (1) CN102598658B (en)
AU (1) AU2010286316B2 (en)
WO (1) WO2011022755A1 (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
GB2497119A (en) * 2011-12-01 2013-06-05 Sony Corp Mapping scene geometry from wide field of view image onto narrow field of view image

Families Citing this family (34)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9526156B2 (en) * 2010-05-18 2016-12-20 Disney Enterprises, Inc. System and method for theatrical followspot control interface
US20120087588A1 (en) * 2010-10-08 2012-04-12 Gerald Carter System and method for customized viewing of visual media
EP2892228A1 (en) 2011-08-05 2015-07-08 Fox Sports Productions, Inc. Selective capture and presentation of native image portions
US11039109B2 (en) 2011-08-05 2021-06-15 Fox Sports Productions, Llc System and method for adjusting an image for a vehicle mounted camera
US10469790B2 (en) * 2011-08-31 2019-11-05 Cablecam, Llc Control system and method for an aerially moved payload system
BR112015004087A2 (en) * 2012-08-31 2017-07-04 Fox Sports Productions Inc method for tracking and marking objects of interest in a transmission; Selective view user interface captures or reproduces multiple cameras at an event; and selective view, capture or playback system of multiple cameras in one event
KR101970197B1 (en) * 2012-10-29 2019-04-18 에스케이 텔레콤주식회사 Method for Controlling Multiple Camera, Apparatus therefor
JP6551392B2 (en) * 2013-04-05 2019-07-31 アンドラ モーション テクノロジーズ インク. System and method for controlling an apparatus for image capture
US9754373B2 (en) * 2013-11-25 2017-09-05 Gregory J. Seita Methods and apparatus for automated bocce measurement and scoring
JP2016046642A (en) * 2014-08-21 2016-04-04 キヤノン株式会社 Information processing system, information processing method, and program
JP6452386B2 (en) * 2014-10-29 2019-01-16 キヤノン株式会社 Imaging apparatus, imaging system, and imaging apparatus control method
US11758238B2 (en) 2014-12-13 2023-09-12 Fox Sports Productions, Llc Systems and methods for displaying wind characteristics and effects within a broadcast
US11159854B2 (en) 2014-12-13 2021-10-26 Fox Sports Productions, Llc Systems and methods for tracking and tagging objects within a broadcast
KR102101438B1 (en) 2015-01-29 2020-04-20 한국전자통신연구원 Multiple camera control apparatus and method for maintaining the position and size of the object in continuous service switching point
CN105072384A (en) * 2015-07-23 2015-11-18 柳州正高科技有限公司 Method for obtaining football moving images
US10003786B2 (en) * 2015-09-25 2018-06-19 Intel Corporation Method and system of 3D image capture with dynamic cameras
CN105488457B (en) * 2015-11-23 2019-04-16 北京电影学院 Dummy emulation method and system of the camera motion control system in film shooting
US10143907B2 (en) * 2015-12-09 2018-12-04 Gregoire Gentil Planar solutions to object-tracking problems
US10471304B2 (en) 2016-03-08 2019-11-12 Sportsmedia Technology Corporation Systems and methods for integrated automated sports data collection and analytics platform
JP6922369B2 (en) * 2017-04-14 2021-08-18 富士通株式会社 Viewpoint selection support program, viewpoint selection support method and viewpoint selection support device
US10198843B1 (en) * 2017-07-21 2019-02-05 Accenture Global Solutions Limited Conversion of 2D diagrams to 3D rich immersive content
WO2019025833A1 (en) * 2017-08-02 2019-02-07 Playgineering Systems, Sia A system and a method for automated filming
JP7246005B2 (en) * 2017-10-05 2023-03-27 パナソニックIpマネジメント株式会社 Mobile tracking device and mobile tracking method
JP2019102907A (en) * 2017-11-30 2019-06-24 キヤノン株式会社 Setting device, setting method, and program
US10735826B2 (en) * 2017-12-20 2020-08-04 Intel Corporation Free dimension format and codec
US10832055B2 (en) * 2018-01-31 2020-11-10 Sportsmedia Technology Corporation Systems and methods for providing video presentation and video analytics for live sporting events
JP7366594B2 (en) * 2018-07-31 2023-10-23 キヤノン株式会社 Information processing equipment and its control method
CN110213611A (en) * 2019-06-25 2019-09-06 宫珉 A kind of ball competition field camera shooting implementation method based on artificial intelligence Visual identification technology
KR102112517B1 (en) * 2020-03-06 2020-06-05 모바일센 주식회사 Unmanned sports relay service method through real time video analysis and video editing and apparatus for same
US11653111B2 (en) 2021-03-31 2023-05-16 Apple Inc. Exposure truncation for image sensors
CN113329169B (en) * 2021-04-12 2022-11-22 浙江大华技术股份有限公司 Imaging method, imaging control apparatus, and computer-readable storage medium
US11750922B2 (en) 2021-09-13 2023-09-05 Apple Inc. Camera switchover control techniques for multiple-camera systems
US12015845B2 (en) 2021-09-13 2024-06-18 Apple Inc. Object depth estimation and camera focusing techniques for multiple-camera systems
WO2024069788A1 (en) * 2022-09-28 2024-04-04 株式会社RedDotDroneJapan Mobile body system, aerial photography system, aerial photography method, and aerial photography program

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5953056A (en) 1996-12-20 1999-09-14 Whack & Track, Inc. System and method for enhancing display of a sporting event
WO2006024078A1 (en) * 2004-08-30 2006-03-09 Trace Optic Technologies Pty Ltd A method and apparatus of camera control
US20070058839A1 (en) 2003-05-01 2007-03-15 Jody Echegaray System and method for capturing facial and body motion
WO2007133982A2 (en) * 2006-05-08 2007-11-22 John-Paul Cana Multi-axis control of a device based on the wireless tracking location of a target device

Family Cites Families (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5164827A (en) * 1991-08-22 1992-11-17 Sensormatic Electronics Corporation Surveillance system with master camera control of slave cameras
US5363297A (en) * 1992-06-05 1994-11-08 Larson Noble G Automated camera-based tracking system for sports contests
JP3365182B2 (en) * 1995-12-27 2003-01-08 三菱電機株式会社 Video surveillance equipment
JP2791310B2 (en) * 1996-08-27 1998-08-27 幹次 村上 Imaging device for multi-angle shooting
US6567116B1 (en) * 1998-11-20 2003-05-20 James A. Aman Multiple object tracking system
US20030210329A1 (en) * 2001-11-08 2003-11-13 Aagaard Kenneth Joseph Video system and methods for operating a video system
US7218320B2 (en) * 2003-03-13 2007-05-15 Sony Corporation System and method for capturing facial and body motion
JP4314929B2 (en) * 2003-08-22 2009-08-19 パナソニック株式会社 Motion detection device
JP2006261999A (en) * 2005-03-16 2006-09-28 Olympus Corp Camera, camera system, and cooperative photographing method using multiple cameras
US20080129844A1 (en) * 2006-10-27 2008-06-05 Cusack Francis J Apparatus for image capture with automatic and manual field of interest processing with a multi-resolution camera
NZ598897A (en) * 2006-12-04 2013-09-27 Lynx System Developers Inc Autonomous systems and methods for still and moving picture production
US9185361B2 (en) * 2008-07-29 2015-11-10 Gerald Curry Camera-based tracking and position determination for sporting events using event information and intelligence data extracted in real-time from position information

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5953056A (en) 1996-12-20 1999-09-14 Whack & Track, Inc. System and method for enhancing display of a sporting event
US20070058839A1 (en) 2003-05-01 2007-03-15 Jody Echegaray System and method for capturing facial and body motion
WO2006024078A1 (en) * 2004-08-30 2006-03-09 Trace Optic Technologies Pty Ltd A method and apparatus of camera control
WO2007133982A2 (en) * 2006-05-08 2007-11-22 John-Paul Cana Multi-axis control of a device based on the wireless tracking location of a target device

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
See also references of EP2474162A4

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
GB2497119A (en) * 2011-12-01 2013-06-05 Sony Corp Mapping scene geometry from wide field of view image onto narrow field of view image
GB2497119B (en) * 2011-12-01 2013-12-25 Sony Corp Image processing system and method
US9325934B2 (en) 2011-12-01 2016-04-26 Sony Corporation Image processing system and method

Also Published As

Publication number Publication date
CN102598658B (en) 2016-03-16
EP2474162A1 (en) 2012-07-11
EP2474162B1 (en) 2019-07-03
AU2010286316A1 (en) 2012-04-19
US20120154593A1 (en) 2012-06-21
EP2474162B8 (en) 2019-08-14
CN102598658A (en) 2012-07-18
JP5806215B2 (en) 2015-11-10
EP2474162A4 (en) 2015-04-08
AU2010286316B2 (en) 2016-05-19
JP2013503504A (en) 2013-01-31

Similar Documents

Publication Publication Date Title
AU2010286316B2 (en) A method and apparatus for relative control of multiple cameras
US9813610B2 (en) Method and apparatus for relative control of multiple cameras using at least one bias zone
JP6719465B2 (en) System and method for displaying wind characteristics and effects in broadcast
US9298986B2 (en) Systems and methods for video processing
EP2277305B1 (en) Method and apparatus for camera control and picture composition
US7193645B1 (en) Video system and method of operating a video system
KR102189139B1 (en) A Method and System for Producing a Video Production
CN113873174A (en) Method and system for automatic television production
US9736462B2 (en) Three-dimensional video production system
Cavallaro et al. Augmenting live broadcast sports with 3D tracking information
US8957969B2 (en) Method and apparatus for camera control and picture composition using at least two biasing means
JPH06105231A (en) Picture synthesis device
WO2018004354A1 (en) Camera system for filming sports venues
EP3836081A1 (en) Data processing method and apparatus
GB2559003A (en) Automatic camera control system for tennis and sports with multiple areas of interest
CA2559783A1 (en) A system and method for graphically enhancing the visibility of an object/person in broadcasting
WO2016032427A1 (en) Three-dimensional video production system

Legal Events

Date Code Title Description
WWE Wipo information: entry into national phase

Ref document number: 201080038605.X

Country of ref document: CN

121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 10811004

Country of ref document: EP

Kind code of ref document: A1

WWE Wipo information: entry into national phase

Ref document number: 13392515

Country of ref document: US

WWE Wipo information: entry into national phase

Ref document number: 2012525813

Country of ref document: JP

NENP Non-entry into the national phase

Ref country code: DE

WWE Wipo information: entry into national phase

Ref document number: 2010286316

Country of ref document: AU

Ref document number: 2010811004

Country of ref document: EP

ENP Entry into the national phase

Ref document number: 2010286316

Country of ref document: AU

Date of ref document: 20100713

Kind code of ref document: A