WO2011022755A1 - A method and apparatus for relative control of multiple cameras - Google Patents
A method and apparatus for relative control of multiple cameras Download PDFInfo
- Publication number
- WO2011022755A1 (PCT/AU2010/000886)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- image
- halo
- primary
- camera
- motion picture
- Prior art date
Classifications
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N7/00—Television systems
- H04N7/18—Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
- H04N7/183—Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast for receiving images from a single remote source
- H04N7/185—Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast for receiving images from a single remote source from a mobile camera, e.g. for remote control
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/40—Scenes; Scene-specific elements in video content
- G06V20/41—Higher-level, semantic clustering, classification or understanding of video scenes, e.g. detection, labelling or Markovian modelling of sport events or news items
- G06V20/42—Higher-level, semantic clustering, classification or understanding of video scenes, e.g. detection, labelling or Markovian modelling of sport events or news items of sport video content
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/60—Control of cameras or camera modules
- H04N23/61—Control of cameras or camera modules based on recognised objects
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/60—Control of cameras or camera modules
- H04N23/62—Control of parameters via user interfaces
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/60—Control of cameras or camera modules
- H04N23/695—Control of camera direction for changing a field of view, e.g. pan, tilt or based on tracking of objects
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/90—Arrangement of cameras or camera modules, e.g. multiple cameras in TV studios or sports stadiums
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N5/00—Details of television systems
- H04N5/222—Studio circuitry; Studio devices; Studio equipment
Definitions
- the present invention relates generally to the field of camera control systems and in one aspect relates to the control of at least two cameras for capturing different images of an object moving across a surface, wherein a primary image is contained within, and movable within, a halo image, the position of the halo image being dependent upon the movement of the object.
- Televised sporting events are extremely popular on both free-to-air and pay television, with many channels being solely dedicated to sport. With the advent of more advanced camera technology, quality has increased and new camera shots have been achieved. Cameras located in cricket stumps and inside race cars are now common.
- the present invention provides an alternative whereby the cameras can be controlled automatically using servos and encoders, enabling autofocus, auto zoom, auto pan and auto tilt.
- This system enables the camera to receive control signals from a control means to facilitate the capturing of imagery of the game.
- the cost of placing a skilled camera operator behind each camera is one of the limitations of the manually controlled systems.
- the placement of cameras around the perimeter of the playing field is restricted.
- a further limitation of a manually controlled system is that camera operators can obscure the action of the sport or stage production when close-ups are needed, as is the case with boxing and ice hockey.
- first and second motion picture cameras being controlled such that the position of the halo image relative to the primary image can be altered.
- the first and second motion picture cameras being controlled such that the primary image retains a portion of the halo image and the position of the halo image relative to the primary image can be altered.
- the object may be a ball being used in a sporting contest, wherein the primary and halo images include motion picture footage of at least the ball.
- the primary and halo images may further include motion picture footage of an individual or individuals engaged in the sporting contest, goals, wickets or relevant line markings.
- the quality and framing of the dynamic primary image is defined by the field of view (zoom) and subject distance (focus) of a lens of said first camera and the camera's alignment on the servo pan tilt head.
- the quality and framing of the dynamic halo image is defined by the field of view and subject distance of a lens of the second camera and the camera's alignment on the servo pan tilt head.
- the shapes of the primary image and halo image can be, but are not limited to, circles, ovals, squares and rectangles.
- the primary image and halo image, defined by respective fields of view and subject distances, can be altered. This is important because the composition of camera footage that is the most desirable for a viewer will vary depending upon the behaviour of the player or players engaged in play. In this way close-up footage of the object, such as a particular sports player, can be captured with one camera whilst secondary cameras automatically capture the wider area around the player, which may include opposing players that may contest for the ball, or teammates to whom the ball may be passed.
- the object being tracked is a ball being used to play a sport such as soccer or basketball, and the motion picture primary and halo images move to thereby include the ball and the individual or individuals engaged in play, or other images of audience interest.
- the term play refers to the progress of the game in which the individual player or players are actively engaged.
- the halo image may be positioned forward of one side of the primary image, wherein the halo image extends forward of the player and includes defending players that are in close proximity to the first player and that may engage them in play within a short period of time.
- the method may use at least one primary image contained within at least one halo image.
- an operator may use the halo image or multiple halo images.
- the primary and halo images may be locked onto a predefined object, including an RF tag or movable point herein referred to as a track node, which may follow the game ball, player or vehicle.
- a track node refers to a series of points having x, y, z coordinates within a mathematical model that is created by surveying and mapping the surface of a selected area.
- the track node may replicate, within the
- the size of the primary and halo images can be individually adjusted.
- the images' sizes can also be set as a percentage of the primary image, as an adjustable fixed size, or as a variable logarithmic percentage of the primary image.
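The sizing options above can be sketched in code. This is an illustrative Python sketch, not from the patent; the function name, mode names and the logarithmic formula are assumptions chosen to mirror the three options described.

```python
import math

def halo_diameter(primary_diameter, mode="percentage", factor=2.0):
    """Return a halo diameter for a given primary-image diameter.

    Modes mirror the sizing options described in the text (names are
    illustrative, not from the patent):
      - "percentage": halo is a fixed multiple of the primary image
      - "fixed":      halo has an operator-set constant size (factor)
      - "log":        halo grows logarithmically with the scaling factor
    """
    if mode == "percentage":
        return primary_diameter * factor
    if mode == "fixed":
        return factor
    if mode == "log":
        return primary_diameter * (1.0 + math.log1p(factor))
    raise ValueError(f"unknown mode: {mode}")
```

For example, a primary image 4 m across with a percentage factor of 2.0 yields an 8 m halo, while "fixed" mode ignores the primary size entirely.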
- the size of a halo may also be determined via the position of the track node within a bias zone.
- the bias zone may have predefined parameters that control the position of the primary and halo images around the tagged object or track node.
- the predefined parameters are preferably stored in software.
- Primary and halo images are preferably controlled by software to facilitate the often complex requirements of correct framing of any given sport or activity.
- the following basic summary alerts the reader to some of the complexities of these interactions.
- the images encircle the tracked object and have offset limit lines that keep the tracked object within specified boundaries. These boundaries can be thought of as a fence that stops the tracked object from exiting.
- the images also have location fields within the limit lines. The location field positions the image around the tracked object depending on the tracked object's position within the bias zone, which typically covers the entire playing arena, and on the direction of travel, which is an operator-adjusted function.
- the space where images can be moved is also restricted by the bump bars, which are typically located just outside the boundary of the playing field or performance space.
- the images may have limit lines, which are lines parallel to the image's external edge that can be offset at specified distances or at a percentage of the image's diameter or longest side. Images are designed to capture the tracked object or track node within the image's limit lines.
- the limit lines effectively give the object or player being framed some space around them before the edge of the television picture frame.
- the limit lines also have a variable cushioning effect that enable the track node to have a range of hard to soft collisions with the limit line. This cushioning effect enables a smoother visual motion picture without jerky changes in direction.
- the limit lines can be outside the image, thereby enabling the track node to be retained within the limit lines while still being outside the image.
- the limit lines can be offset from the outside edge of the image, and the methods of offset include, a specified distance, specified percentage of the diameter or diagonal, and a combination of both percentage and specified minimum and maximum distances.
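The offset methods listed above can be expressed as a small helper. This is a minimal sketch, not the patent's implementation; the function and parameter names are illustrative.

```python
def limit_line_offset(diameter, fixed=None, percent=None, min_d=None, max_d=None):
    """Offset of a limit line from the image's outside edge.

    Supports the three methods described in the text: a specified fixed
    distance, a percentage of the image diameter (or diagonal), or a
    percentage clamped between minimum and maximum distances.
    """
    if fixed is not None:
        return fixed
    offset = diameter * (percent or 0.0)
    if min_d is not None:
        offset = max(offset, min_d)
    if max_d is not None:
        offset = min(offset, max_d)
    return offset
```

For a 10 m image, a 10% offset gives 1 m, while a 50% offset capped at 3 m gives 3 m.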
- the relationship between the primary and halo images is relative to, and controlled by, a control means.
- the size of the primary image may be proportional to the halo image. This proportional relationship may be directly or inversely proportional, and may be linear or exponential.
- each image has a location field that consists of x, y and z axes that typically bisect the centre of the image.
- Location fields have variable patterns, which include, but are not limited to, orthogonal patterns with one or two axes, curved grid patterns, parabolic patterns, or concentric circle patterns.
- the track node, which is the object being tracked, interacts with the location fields, the direction of travel, and the bias zones to enable the correct motion picture framing of the tracked object within the television picture frame.
- the location field adjusts the position of the track node along its x axis, in proportion to the direction of travel of the track node.
- the location field adjusts the position of the track node along its y axis, in proportion to the track node's position within the bias zones. Further information on the methods of interaction between track nodes, location fields, direction of travel and bias zones is contained in subsequent sections.
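The proportional y-axis adjustment can be sketched as follows. This is an assumed, simplified model (linear and clamped), not the patent's actual formula; all names are illustrative.

```python
def image_offset_in_field(node_y, zone_centre_y, zone_half_height, max_offset):
    """Offset of the image centre from the track node along the y axis,
    proportional to the node's normalised position within the bias zone.

    The normalised position is clamped to [-1, 1] so the offset never
    exceeds max_offset, keeping the node inside the image's limit lines.
    """
    t = (node_y - zone_centre_y) / zone_half_height
    t = max(-1.0, min(1.0, t))
    return t * max_offset
```

A node at the zone's edge gets the full offset; a node halfway out gets half of it.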
- the images' movement, size, position and relationship with each other may vary depending on the tracked object's velocity, direction of travel, behaviour, position within the bias zone and relative direction with respect to the physical location of the first or second camera.
- the relationship may also be altered depending upon the character of the object being tracked. For instance, where a player is being tracked, their movement and behaviour will be restricted to a narrow flat band adjacent to the playing surface. In contrast, the movement and behaviour of a football being kicked would be quite different and would be within a broader band that extends upwardly from the playing surface. Accordingly, the relationship may be altered by the trajectory or expected trajectory of the ball. In such a situation the dynamic primary image may follow the trajectory of the ball whilst the dynamic second halo image may capture footage of the expected landing area, calculated from the trajectory of the ball. Typically the primary image is positioned within the halo image, although it should be appreciated that the halo image may be separated from the primary image.
- the halo image may be uncoupled from the primary image such that the second camera is directed at the goal when the track node or ball comes into contact with the specified area.
- the uncoupling of the halo image from the primary image may be done automatically by way of computer software when the target object is located within a predetermined space such as the goal square. Alternatively this uncoupling can be performed via the user interface and in one form a switch may be used. The uncoupling of the images or halos may also occur when footage of the crowd, coach's box, or other predetermined areas is required. This uncoupling and repositioning of the second camera may be performed by separate control switches.
- multiple halo images can surround the primary image and each halo image can have its own specified size.
- the capturing of the images is controlled by software functions that may include bias zones, bump bars, direction of travel, framing limit lines, a split button, and proportional head room framing. Individual halo images may be able to interact with the software while the primary image may not. The operator can individually activate or deactivate each image's interaction with the software.
- a motion picture capturing apparatus including, a first camera for capturing a dynamic primary image of a moving object, a second camera for capturing a dynamic halo image that extends around the primary image, and a control means for controlling the position of said dynamic halo image around the said dynamic primary image.
- the first camera and all secondary cameras are controlled by servo-assisted pan tilt heads and servo-assisted lenses that control the focus and zoom.
- the control means further controls the pan, tilt, zoom and focus of the respective first and all secondary cameras.
- the relationship between the primary image and all halo images may be altered by use of the control means that may include a user interface and designated software.
- This user interface may include a touch screen, which shows live video and a synchronised 3D model of the playing area.
- the control means may require the synchronisation of the virtual 3D computer generated environment with a camera's real world view of the same environment. This synchronisation enables the operator to see the overlaid 3D model, such as a soccer field's line markings, over the video. This enables the operator to work in the 3D computer model while still seeing what is happening via the video.
- This synchronisation typically requires: the calibration and charting of the servo encoded lens's zoom and focus; a 3D model of the environment, created either by surveying the environment or by having a known standard environment such as a tennis court; the cameras having known 3D locations with associated x, y, z coordinates, with the pitch and yaw of the horizontal plane of each camera head also known; and each camera being mounted onto a servo encoded pan tilt head.
- This synchronisation enables a computer to determine the camera's field of view via the encoder's reading of the pan, tilt, zoom and focus settings. As a result the operator sees an accurate virtual 3D model superimposed over the real world video. Thus when a camera's field of view moves, then the synchronised 3D model also precisely moves in real time.
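Given a camera's known 3D location and a target point, the pan and tilt needed to centre the target follow from basic trigonometry. This is an illustrative sketch of that geometry, not the patent's implementation; names and conventions (pan about the vertical axis, tilt as elevation) are assumptions.

```python
import math

def pan_tilt_to_target(camera_xyz, target_xyz):
    """Pan and tilt angles (radians) that point a camera at a 3D target."""
    dx = target_xyz[0] - camera_xyz[0]
    dy = target_xyz[1] - camera_xyz[1]
    dz = target_xyz[2] - camera_xyz[2]
    pan = math.atan2(dy, dx)                    # rotation about the vertical axis
    tilt = math.atan2(dz, math.hypot(dx, dy))   # elevation above the horizontal
    return pan, tilt
```

Running this inverse computation in reverse (encoder readings to field of view) is what lets the 3D model stay registered over the live video.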
- This synchronisation now enables one human operator to accurately command and control in real time multiple cameras around a designated area and see the camera vision and the superimposed 3D geometric and spatial software functions working. This can enable far superior accuracy of framing and focusing on dynamic targets.
- control means further includes a broadcast switching device to enable the operator to select the footage that is to be broadcast or recorded.
- the components of the apparatus, such as the cameras, display means and control means, may be connected by way of a communication means such as, but not limited to, a modem communication path, a computer network such as a local area network (LAN), a wide area network (WAN) such as the Internet, or a radio frequency (RF) link.
- the processor and the memory cooperate with each other and with other components of a computer to perform all of the functionality described herein.
- the processor executes appropriate software to perform all of the functionality described herein.
- the control means is a computer including RAM and ROM memory, a central processing unit or units, input/output (I/O) interfaces and at least one data storage device.
- the computer includes application software for controlling the cameras and performing functions, stored in a computer readable medium on a storage device.
- the apparatus may include embedded software or firmware with corresponding hardware that is designed to perform one or more dedicated functions.
- control means includes a computer monitor with a virtual model or map of the playing surface which overlays in real time over the camera's video.
- the virtual model may include such things as the boundaries of the playing surface, goals and relevant line markings. It is within the computer model that the operator can command and control and see the various geometric and spatial software functions working over the camera's video.
- a motion picture capturing apparatus including, a first camera for capturing a dynamic primary image of a moving object, at least one second camera for capturing a dynamic halo image that substantially extends around the primary image, and a control means for controlling the position of said dynamic images.
- a track node may be stored within software to facilitate the positioning of the said primary and halo images.
- Track nodes are mathematical points that can be assigned to track vehicles, players or the match ball to give them a positional reference.
- the real time position of the track node is governed by, but not limited to, GPS devices, RF tagging devices, optical recognition devices, and manual tracking using either a mouse or a stylus on a touch screen.
- Images can be individually assigned to specified track nodes.
- Track nodes can spatially interact with the images in a variety of ways.
- a track node may be locked onto the cutting plane, thereby setting the height of the track node away from the playing surface, while allowing the track node to travel across the cutting plane in any direction, at any speed and acceleration.
- the track node can also be offset from the cutting plane in a variety of ways, including but not limited to: a wheel on a mouse, a wheel within a control interface, and depressing a button and using a touch screen stylus to move either up or down the touch screen.
- the computer uses the position of the track node to calculate the subject distance for the lenses' focus settings, thereby enabling the area around the track node to always be in focus.
- the subject distance is the distance from the lens to the subject or tracked target.
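The subject-distance computation is simply the Euclidean distance from the lens to the track node. A minimal sketch, assuming both positions are known x, y, z coordinates in the surveyed model:

```python
import math

def subject_distance(camera_xyz, node_xyz):
    """Lens-to-subject distance used to drive the servo focus setting.

    Both arguments are (x, y, z) coordinates in the surveyed 3D model.
    """
    return math.dist(camera_xyz, node_xyz)
```

Feeding this distance into the calibrated focus chart keeps the area around the track node in focus as it moves.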
- Multiple track nodes can be utilised where there are multiple targets requiring tracking.
- Nominated cameras can be exclusively assigned to specified track nodes while interacting with the software functions.
- the cutting plane enables the images to take their z-axis position from the cutting plane's surface.
- the cutting plane is a mathematical plane contained within software that is offset from the playing surface at variable heights.
- the plane can be parallel to a designated surface, or it can be a curved or variable surface over the playing field or surface.
- the cutting plane can also be shaped into any profile such as a plane that is offset 1 meter and parallel to a complex and undulating motor racing track.
- cutting planes will extend well beyond the primary playing area into secondary areas, such as the surrounding playing areas, grandstands and vehicular run-off areas.
- the primary function of the cutting plane is to allow the track nodes, and thereby the captured images to travel across the cutting plane's surface or be offset from it.
- the cutting plane enables better accuracy when tracking motor vehicles because the vehicle's height from the racing track is always known (unless the vehicle is flying); therefore GPS tracking inaccuracies in the Z direction, or height, can be removed.
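This Z-error removal amounts to discarding the GPS height and substituting the known cutting-plane height. A minimal sketch, assuming the track surface is available as a height function so the plane can follow an undulating circuit; all names are illustrative.

```python
def snap_to_cutting_plane(gps_point, surface_height, offset=0.0):
    """Project a (possibly noisy) GPS fix onto the cutting plane.

    surface_height is a callable (x, y) -> track-surface height, so the
    plane can follow an undulating circuit; offset lifts the plane above
    the surface (e.g. 1 m over a motor racing track). The GPS z value is
    discarded, removing vertical GPS error.
    """
    x, y, _ = gps_point
    return (x, y, surface_height(x, y) + offset)
```

A noisy fix of (3, 4, 7.2) over a flat track with a 1 m offset snaps to (3, 4, 1.0).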
- a bias zone contained within the software interacts with the track node's position within the bias zone to dictate how the images are positioned around the track node.
- Bias zones have variable patterns that include but are not limited to: orthogonal patterns with one or two axis, or concentric circle or oval patterns.
- the track node may travel either side of the bias zone's x axis, and the further the track node is from the bias zone's x axis, the further the track node is from the image's x axis, while still staying within the image's limit line.
- Multiple bias zones may also be utilised, for example an orthogonal bias zone covering an entire soccer field and two concentric circle bias zones each with a 30m radius centred on each goal.
- the resultant effect on the halo images around the track node is based on the averaging of the two bias zones' effects, which is of course dependent on the track node's position within the bias zones.
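The averaging of multiple bias zones can be sketched by treating each zone as a function that proposes an offset vector for the image around the node. This is an assumed formulation (a plain arithmetic mean), not the patent's; names are illustrative.

```python
def combined_bias_offset(node_xy, zones):
    """Average the image-offset vectors proposed by several bias zones.

    Each zone is a callable taking the node's (x, y) position and
    returning an (dx, dy) offset for the image around the node.
    """
    offsets = [zone(node_xy) for zone in zones]
    n = len(offsets)
    return (sum(o[0] for o in offsets) / n,
            sum(o[1] for o in offsets) / n)
```

With an orthogonal field-wide zone proposing (2, 0) and a goal-centred circular zone proposing (0, 4), the combined offset is (1, 2).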
- a direction of travel function may be stored within the software and in one form may be manually controlled via an adjustable slide device which has a neutral middle position and variable forward and back calibrations.
- the direction of travel creates leading space forward or behind the track node within the images.
- 90% forward on the slide results in the track node being located 90% back from the image's centre, thereby generating a very large leading space within the halo image in front of the track node.
- the magnitude of the leading space, or distance between the track node's position and its offset from the image location field's y axis, is proportional to the magnitude of the direction of travel. The side of the image on which the leading space occurs is governed by the operator and is typically dependent on which way the ball is travelling.
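The slider-to-offset mapping can be sketched as a linear, clamped function. This is an illustrative interpretation of the 90%-forward example above, not the patent's exact formula; names are assumptions.

```python
def track_node_offset(slider, image_width):
    """Map the direction-of-travel slider to the track node's offset from
    the image centre along x.

    slider is in [-1, 1] (0 = neutral). A setting of +0.9 places the node
    90% of the half-width behind centre, leaving a large leading space in
    front of it; negative values lead in the opposite direction.
    """
    slider = max(-1.0, min(1.0, slider))
    return -slider * image_width / 2.0
```

For a 10 m wide image, a 0.9 slider setting offsets the node 4.5 m behind centre.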
- a bump bar function may be stored within the software.
- Bump bars are a software spatial ordering function that enable the images to bump into them, but generally do not let the images pass over their geometric alignment. Bump bars are like a fence that can be aligned where required, to frame the perimeter of the playing field. Bump bars have a variable deceleration setting that enables the halo images to cushion into the bump bars before contact occurs.
- the images have three optional functionalities: firstly, to recognise bump bars and cushion into them; secondly, to ignore the bump bars and their associated functions; and thirdly, a hybrid option where the halo images use the bump bars until the primary image crosses the bump bar, at which point the halo image will continue to surround the primary image as both images cross over the bump bars.
- the bump bars stop the specified images from departing the area of the playing field, thereby keeping the cameras' fields of view on the playing surface and on the players.
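The cushioned deceleration into a bump bar can be sketched as a clamped step whose size shrinks inside the cushion zone. This is an assumed one-dimensional model (proportional slowdown), not the patent's implementation; names are illustrative.

```python
def cushioned_step(image_edge, bar, step, cushion):
    """Move an image edge toward a bump bar, decelerating proportionally
    inside the cushion zone and never crossing the bar.

    image_edge and bar are positions along one axis; step is the desired
    movement this update; cushion is the deceleration-zone width.
    """
    gap = bar - image_edge
    if gap <= 0.0:
        return bar              # already at (or past) the bar: clamp
    if gap < cushion:
        step *= gap / cushion   # slow down as the bar approaches
    return min(image_edge + step, bar)
```

Far from the bar the image moves at full speed; inside the 5 m cushion a 2 m step shrinks in proportion to the remaining gap.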
- a picture frame function may be stored within the software.
- the picture frame is a software ordering function that graphically shows the camera's "16 x 9 picture plane" around the captured image.
- the sides of the picture frame always touch the images' external edges relative to the viewing alignment of the camera. As such if the image expands then the picture frame expands.
- the sill and head heights of the picture frame and the centre of the picture frame can be set in a variety of ways. Firstly, the bottom alignment of the picture frame, or sill, can have a vertical offset distance from either the cutting plane or the track surface at the track node's location; secondly, the picture frame can be set so that a specified horizontal axis or band of the picture frame always retains the track node on it while the picture frame holds the entire captured image; and thirdly, the side of the captured image closest to the camera will rest on the picture frame's sill.
- An additional overriding function on the picture frame's head height is the proportional head room function, which interacts with the size of the images and the height of the cutting plane. Once the picture frame's top alignment has reached a specified height above the playing surface, the picture frame's head will not drop any further; if the picture frame needs to reduce in size because of a contracting image, the picture frame's bottom alignment, or sill, will rise, allowing the picture frame to shrink.
- This proportional framing function can also be used in an inverse fashion, so that the operator can zoom in on the player's feet in a similar manner.
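The head-room rule can be sketched as a simple constraint on the frame's vertical limits. This is an assumed one-dimensional model of the behaviour described, with illustrative names.

```python
def frame_vertical_limits(sill, frame_height, min_head):
    """Apply the proportional head-room rule.

    Once the frame's head would fall below min_head, the head is pinned
    there and a shrinking frame raises the sill instead, so the frame can
    still contract without the head dropping further.
    """
    head = sill + frame_height
    if head < min_head:
        head = min_head
        sill = head - frame_height
    return sill, head
```

A 3 m tall frame with its sill at ground level and a 5 m minimum head height is lifted so the sill sits at 2 m; a 10 m frame is unaffected.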
- Picture frames and the visual limit plane have a geometric relationship that stops the picture frame from passing across a visual limit plane.
- a visual limit plane function may be stored within the software.
- a visual limit plane can be of any size and shape and can be positioned at any horizontal, vertical or angular alignment.
- the visual limit plane is a spatial software function that enables the camera's view to be restricted from looking past a specified alignment or plane.
- the visual limit plane affects the camera's zoom, pan and tilt. In a typical sporting application like soccer, the visual limit plane will be located just under the roof line of the stadium; when the wide field of view camera and its associated wide image are tracking a player on the far side of the field, the head of the picture frame will contact the visual limit plane, stopping the camera's field of view from seeing under the stadium roof and pushing it further onto the playing field where the action is.
- Visual limit planes can be set individually for each camera and are particularly useful when located just under the roofs of stadiums, at stage boundaries, or at the edges of unsightly structures.
- the operator can set the visual limit planes and bump bars in appropriate positions within the 3D model which is superimposed over the real time video and examine all camera views for functionality and aesthetic composition.
- a split button function may be stored within the software and enables the operator to push a button, thereby releasing the specified images from the cutting plane to follow a target such as a basketball through a path of travel.
- the system recognises the track node's location and draws a base line from that point to the designated target point which can be the centre of the basketball or netball hoop.
- an image tally light function may be stored within the software.
- the image tally light may highlight the live feed camera's halo or picture frame.
- a vista line function may be stored within the software and creates a series of lines within the virtual 3D computer model that start at a camera location and extend to the tangent points on both sides of that camera's images. The lines may be terminated at the image's tangents, at the cutting plane, or at a designated distance past the image. Similarly, the centre vista line starts at the camera location and extends to the track node, and may terminate at the track node, at the cutting plane, or at a designated distance.
- a hierarchy of commands function may be stored within the software. Many of the aforementioned functions interrelate with each other and in some circumstances may need to override each other. As such, a hierarchy of commands is structured within the system, enabling commands to overrule other commands.
- a relative zoom points function may be stored within the software.
- This software function enables a point on the cutting plane to be selected, e.g. the soccer goals, and for that point to stay in the same location within the camera's field of view as the operator zooms in or out, either by manual controls or in a preset manner.
- This software command can also utilise the camera's picture plane via the systems understanding of the lens's field of view.
- a pan point function may be stored within the software and enables the operator to select two points, a genesis point and a terminus point, whereby the designated camera will pan between these points along a designated path.
- This designated path or spline can be adjusted by the operator to form any alignment within a 3D space.
- the zoom setting or key framing at the genesis and terminus points and at any number of points along the spline can be designated so that the lens' zoom extrapolates evenly between them as the camera's centre of view pans along the spline. Time, zoom settings, and speed between the pan points can be specified.
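The even extrapolation of zoom between key-framed points can be sketched as piecewise-linear interpolation along the spline parameter. This is an illustrative sketch (linear segments), not the patent's method; names are assumptions.

```python
def zoom_along_spline(t, keyframes):
    """Interpolate the lens zoom between key-framed pan points.

    keyframes: sorted list of (t, zoom) pairs, with t in [0, 1] measuring
    progress along the spline from genesis (0) to terminus (1). Zoom is
    interpolated evenly (linearly) between adjacent keyframes.
    """
    if t <= keyframes[0][0]:
        return keyframes[0][1]
    for (t0, z0), (t1, z1) in zip(keyframes, keyframes[1:]):
        if t0 <= t <= t1:
            return z0 + (t - t0) / (t1 - t0) * (z1 - z0)
    return keyframes[-1][1]
```

With keyframes at (0, 10) and (1, 20), the zoom halfway along the spline is 15; extra keyframes simply add segments.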
- Figure 1 is a schematic view of a primary image and the surrounding halo
- Figure 2 is a schematic view of a first embodiment of the apparatus for camera control of the present invention
- Figure 3a is a schematic view of the various configurations of the primary image area and surrounding halo image area of figure 1 illustrating the bump bars around the periphery of the playing arena;
- Figure 3b is a schematic view of the primary and halo images and their interaction pattern as they move within the bias zone, showing that the interaction pattern is based firstly upon the position of the track node within the bias zone and secondly upon the position of the bump bars;
- Figure 3c is a schematic view of fixed-size primary and halo images and their interaction pattern as they move within the circular bias zone;
- Figure 3d is a schematic view of a fixed-size primary image and a variable-size halo image and their interaction pattern as they move within the circular bias zone;
- Figure 3e is a schematic view of a halo and its component parts
- Figure 3f is a schematic view of some of the embodiments of a halo
- Figure 3g is a schematic view of a bias zone and its component parts
- Figure 3h is a schematic view of some of the embodiments of a bias zone
- Figure 4a is a schematic view of the primary image of figure 1 illustrating a first embodiment of the vertical barrier above the playing surface;
- Figure 4b is a schematic view illustrating a second embodiment of the vertical boundary above the playing surface
- Figure 5 is a schematic view illustrating a further embodiment
- Figure 6 is an overhead view of the movement of a player across a playing surface illustrating the position of the images captured by the first and second cameras.
- the motion picture capturing apparatus includes a first camera 12 for capturing a dynamic primary image 14 of an object 16, the primary image 14 being defined by the field of view 18 and subject distance 20 of the lens 22 of the first camera 12.
- the apparatus 10 further includes a second camera 24 for capturing a dynamic halo image 26 that contains and extends around the primary image 14, the halo image being defined by the field of view 28 and subject distance 30 of the lens 32 of the second camera 24.
- the dimensions of at least the halo image 26 and the position of the primary image 14 therewithin may be altered depending upon the direction of travel and behaviour of the object 16.
- the apparatus 10 can be used to capture footage of a sporting contest, such as a game of soccer.
- the first and second cameras 12, 24 are placed around a playing surface in this example being a soccer field 34 having a boundary line 36, various field markings 38 and opposing goals 40, 42.
- a third camera 44 is configured to capture an image 46 of the playing field 34.
- Signals are received from and sent to cameras 12, 24 and 44 by way of communication means 48.
- the communication means 48 may be hard wired to the cameras or be connected by way of a transmitter/receiver.
- the communication means 48 is connected to a control means 50, which includes a touch screen 52 for displaying image 46 and a stylus 54 for controlling the images captured by the first and second cameras 12, 24, and to a broadcast switcher 56 in communication with a broadcast tower 58 for controlling the television images broadcast.
- the broadcast switcher 56 includes switches 60, 62 for selecting the desired images for broadcasting.
- the object 16 is a soccer player 64 who is kicking a ball 66 down the field 34 in the direction of arrow 68 which indicates the direction of travel.
- the direction of travel is communicated to the control means 50 via the joystick 74.
- the image 46 of the field is displayed on the touch screen 52.
- the operator uses the stylus 54 to position the track node 11 in the centre of the play, between the soccer player 64 and the soccer ball 66.
- the size of the images can be controlled via the rotation of the joystick's knob 75.
- the movement of the stylus 54 across the display means 52 generates digital signals representative of the required panning, tilting, focusing and zoom operations of the cameras 12, 24 and their lenses 22, 32 to track an object 16 across surface 34.
- the operator can either select to follow an individual player that is in control of the ball or the ball itself depending upon the required shots and whether the ball is being passed between players.
- the movement of the stylus 54 across the screen 52 results in corresponding movement of cameras 12, 24. It should however be appreciated that the user's finger or tracking subsystems could be used instead of the stylus 54 to track movement of the object 16 across the touch screen 52.
- the stylus 54 is used to control the first camera such that the track node 11 corresponds to the position of the stylus 54 on the image 46 displayed on the screen 52. In the present embodiment, the position of the stylus 54 controls the position of the halo 26 around the primary image 14.
- the images 14, 26 captured by the first and second cameras 12, 24 are displayed on screens 70, 72.
- the screens 70, 72 are used so that the operator can select the best image for broadcasting.
- the display means 52 may include the images captured by the cameras or the apparatus may include a separate split screen displaying the images captured by the various connected cameras.
- the apparatus 10 utilises a joystick 74 for controlling the direction of travel although in another form this joystick 74 can be used for controlling the position of the images around the track node 11.
- the joystick knob 75 may also be used to control the dimensions of the primary and/or halo images.
- the computer includes application software for controlling the computer and receiving data from the screen 52, stylus 54 and joystick 74.
- the software is configured to generate appropriate signals to control the servo-assisted camera heads and encoded lenses that control pan, tilt, focus and zoom of the cameras 12, 24 depending upon the signals received from the screen 52, stylus 54 and joystick 74.
- Application software may be stored in a computer.
- the lenses 22, 32 are calibrated either by using the manufacturer's data or by setting up the camera and lens in a known environment and recording the focus and zoom settings at variable distances and variable fields of view. Encoders recognise these focus and zoom settings and this data is stored; alternatively the analogue settings of the lens may be used, but these will not be as accurate. System algorithms utilise this data to enable automated lens control. Thus focus for each lens is achieved by knowing the distance between the camera location and the track node 11. The lens's zoom is achieved by knowing the size of the image 14 and the distance between camera 12 and image 14, then applying the calibrated lens's algorithms to produce the correct field of view (zoom).
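For an idealised pinhole lens, the zoom relationship above reduces to simple geometry: the field of view that makes an image of a given width fill the frame at a given subject distance. A real system would map the resulting angle through the stored calibration data rather than use the formula directly; this is a sketch under that simplifying assumption.

```python
import math

def required_fov_deg(image_width_m, subject_distance_m):
    """Horizontal field of view (degrees) needed so an image of the given
    width exactly fills the frame at the given subject distance.
    Pinhole geometry only -- encoded lenses would be driven through their
    calibration tables instead of this closed form."""
    return math.degrees(2 * math.atan(image_width_m / (2 * subject_distance_m)))

# A 20 m wide halo at a 50 m subject distance:
fov = required_fov_deg(20.0, 50.0)
```

As expected, the same halo at a greater distance needs a narrower field of view, which is why the system must know both the halo size and the camera-to-halo distance.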
- the camera's servo driven pan tilt heads are also encoded thereby enabling the system to recognise, command and control the direction of the camera's alignment.
- the camera control system can be used to record images of various sporting activities.
- the apparatus 10 can be used to capture footage of a basketball game played on a basketball court 76 having court markings 78, a boundary line 80 and opposing hoops 82 and 84.
- the control means 50 includes a virtual map of the playing surface. This virtual map includes the respective court markings, boundary line and positions of the basketball hoops.
- the virtual map also includes a virtual barrier or bump bar 86 that constrains the movement of the first and second cameras to thereby control the images 14, 26 that are captured. The reader should appreciate that this prevents unwanted footage being captured, such as running tracks around the outside of the playing field or images of the edge of the crowd or empty seats.
- Figure 3b illustrates the variable relationship between primary image 14 and halo image 26, dependent on the position of the track node 11 within the bias zone 6 and the direction of travel 68, which is set at 50% left. The illustration shows that when the direction of travel 68 is 50% left, the track node 11 is at the +50th percentile of the halo image 26 location field's y axis throughout the bias zone 6, until the halo image 26 collides with the bump bar 100, at which time the halo image 26 stops and the primary image 14 is allowed to slide to the left within the halo image 26.
- Figure 3b also shows that when the track node is on the bias zone's 80% x axis 31 alignment, the secondary image location field has the track node on its 80% x axis 17a alignment. Similarly, when the track node is on the bias zone's -40% x axis alignment, the secondary location field has the track node on its -40% x axis alignment. And once again, when the track node is on the bias zone's -80% x axis alignment, the secondary location field has the track node on its -80% x axis alignment.
- the centre of the image's X and Y axes is 0% and the image's limit lines 19 are +/-100%.
- the properties of the bias zone can also be changed; this includes both linear and logarithmic relationships between bias zones and the track node's position within the location field. Multiple overlapping bias zones can be used together, which enables an averaging of the bias zones' effects on the image's position around the track node. This enables the halo cameras to have a particular bias towards a geographical location such as a soccer goal.
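A minimal sketch of how overlapping bias zones could be averaged into a single offset within the image's location field. The weighting scheme and the exponent used for the non-linear option are illustrative assumptions, not the patented mapping.

```python
def offset_in_location_field(zone_positions, weights=None, gamma=1.0):
    """Combine the track node's normalised positions (-1..+1) within several
    overlapping bias zones into one offset within the image's location field.
    gamma=1.0 reproduces the linear relationship; gamma != 1.0 applies a
    power curve as a stand-in for the non-linear (e.g. logarithmic-style)
    relationships the text mentions. A sketch of the averaging behaviour,
    not the patented method."""
    weights = weights or [1.0] * len(zone_positions)
    total = sum(weights)
    avg = sum(w * p for w, p in zip(weights, zone_positions)) / total
    sign = 1.0 if avg >= 0 else -1.0
    return sign * (abs(avg) ** gamma)

# A track node at +80% in one zone and +40% in an overlapping zone averages
# to +60% within the location field:
offset = offset_in_location_field([0.8, 0.4])
```

Weighting one zone more heavily than another is what would give a halo camera its bias towards, say, the soccer goal.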
- Concentric circle bias zones, as in Figures 3c and 3d, work in a different manner to those discussed previously. Concentric circle bias zones control the halo image's position around the track node. This is enabled by creating an alignment line 19a between the track node 11 and the centre of the bias zone 6, which is extended at the track node end so as to bisect the primary image; alternatively the alignment line is extended by an additional percentage or offset distance.
- the operator's preset options include: fixing the size of the secondary image as per figure 3c; enabling the size of the secondary image to expand and contract while always keeping the centre of the bias zone and primary image within its limit line as per figure 3d; enabling the primary image to be positioned within the secondary image in accordance with typical bias zone methods as per figure 3c; and having the primary image always tangential to the secondary image's limit line as per figure 3d.
- the methods illustrated in figures 3c and 3d are useful in numerous sporting applications where goals are being used and the television viewer's focus of attention is generally where the game ball and the goals are. This would be the case in soccer, netball, ice hockey and basketball. Similarly in cricket, the entire cricket pitch can be part of the bias zone centre, which is always within a camera's halo, as is the ball as it is hit around the cricket ground.
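Geometrically, the alignment-line construction of figures 3c and 3d amounts to extending the line from the bias-zone centre through the track node by an offset, giving the point the halo image is centred on so that both the zone centre (e.g. the goals) and the play stay in shot. The coordinate conventions below are assumed for the sketch.

```python
import math

def halo_centre(zone_centre, track_node, offset):
    """Extend the alignment line from the bias-zone centre through the track
    node by `offset` metres; the returned point is where the halo image is
    centred. A geometric sketch of the figure 3c/3d behaviour under assumed
    conventions, not the patented construction."""
    dx = track_node[0] - zone_centre[0]
    dy = track_node[1] - zone_centre[1]
    d = math.hypot(dx, dy)
    if d == 0:
        # Track node at the zone centre: no direction to extend along.
        return track_node
    scale = (d + offset) / d
    return (zone_centre[0] + dx * scale, zone_centre[1] + dy * scale)

# Track node 30 m to the right of the zone centre, extended 10 m further:
centre = halo_centre((0.0, 0.0), (30.0, 0.0), 10.0)
```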
- Figure 3b shows that the track node is central within the primary image regardless of the track node's direction of travel or its position within the bias zone, although the primary image does have the same functionality as the halo image to have the track node offset within itself, dependent on the direction of travel and the track node's location within the bias zone.
- Primary and halo images can have a preset maximum and minimum size.
- the centre of the image's axes is 0% and the limit lines are +/-100% on all axes. Both a linear and a logarithmic relationship can be used between the direction of travel and the track node's position within the location field.
- the bias zones, images and image location fields may all be 3D spatial structures, working in similar methodologies to those previously described, although having 3D properties.
- Adopted 3D structures may include spheres, cylinders, cones, or rectangular prisms.
- a GPS tag would typically be used to establish the real-time 3D location of the track node.
- the virtual map of the court 76 stored on the control means 50 is in three dimensions.
- the virtual map includes a cutting plane 92, which is used to control the plane on which the images 14, 26 move.
- the height of the cutting plane 92 can be varied.
- the position of the stylus 54 on the cutting plane typically generates the location of the track node.
- Figure 4a illustrates an area 94 or image that a number of cameras may be focused on.
- the ball is typically passed at chest height hence the cutting plane is located at chest height as per figure 4a.
- Activity in soccer generally occurs at ground level, hence the cutting plane 92 would be lowered accordingly.
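Mapping an operator's screen position to a track node on the cutting plane is, geometrically, a ray-plane intersection: the stylus position on the overview camera's image defines a viewing ray, and where that ray meets the cutting plane becomes the track node. The sketch below assumes a horizontal plane and a known ray; deriving the ray from pixel coordinates would additionally need the overview camera's calibration.

```python
def ray_plane_intersection(origin, direction, plane_height):
    """Intersect a viewing ray with a horizontal cutting plane z = plane_height.
    Returns the 3D point, or None when the ray is parallel to the plane or
    points away from it. An illustrative sketch, not the system's actual
    projection code."""
    ox, oy, oz = origin
    dx, dy, dz = direction
    if dz == 0:
        return None
    t = (plane_height - oz) / dz
    if t <= 0:
        return None
    return (ox + t * dx, oy + t * dy, oz + t * dz)

# Overview camera 10 m up, ray angled down and forward; chest-height plane 1.4 m:
node = ray_plane_intersection((0.0, 0.0, 10.0), (1.0, 0.0, -1.0), 1.4)
```

Lowering `plane_height` to near zero reproduces the soccer case, where the action is at ground level.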
- the virtual map includes barrier 96, which inhibits the vertical movement of the field of view 18 (figure 1) above a certain plane.
- the barrier 96 can be either parallel to the playing surface 76, as illustrated in figure 4a, or may take any form or shape, including being sloped upwardly from a mid point of the court to the opposing goals 82, 84, as illustrated in figure 4b.
- the barrier 96 above the playing surface acts like a virtual roof and prevents footage being captured of unwanted detail such as empty spectator stands.
- When a target is in correct sharp focus, the distance between the focal point of the lens and the target is known as the subject distance 20.
- the end point of the subject distance may be coupled to the object 16 or to the centre of the primary or halo image 14, 26.
- the plane of the halo image 26 can be offset from the plane of the primary image 14. This action may occur from a bias zone interaction affecting only halo image 26.
- the position of image 26 enables both the basketball hoop 82 and the player 64 to be in shot, and the focus to be as sharp as possible.
- the primary and halo images may be uncoupled, whereby one halo image tracks an object such as a ball while the other halo image is trained in a prescribed manner onto the landing zone of the ball, which is calculated via the ball's trajectory.
- This function can be activated by the operator or be automatic.
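The landing zone calculated from the ball's trajectory can be sketched with drag-free ballistics: solve for when the ball's height returns to ground level and evaluate the horizontal position there. A real implementation would refine this with measured trajectory data and air resistance; the function below is a simplified assumption.

```python
import math

def landing_point(pos, vel, ground_z=0.0, g=9.81):
    """Predict where a ball in free flight lands on the plane z = ground_z,
    given its current position (x, y, z) and velocity (vx, vy, vz).
    Drag is ignored -- a simplified ballistic sketch of how the second halo
    could be trained onto the landing zone."""
    x, y, z = pos
    vx, vy, vz = vel
    # Solve (z - ground_z) + vz*t - 0.5*g*t^2 = 0 for the positive root.
    a, b, c = -0.5 * g, vz, z - ground_z
    disc = b * b - 4 * a * c
    if disc < 0:
        return None
    t = (-b - math.sqrt(disc)) / (2 * a)
    return (x + vx * t, y + vy * t)

# Ball kicked from 1 m height at 10 m/s forward, 5 m/s upward:
target = landing_point((0.0, 0.0, 1.0), (10.0, 0.0, 5.0))
```

The second halo camera would then be framed on `target` while the first continues to track the ball itself.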
- cameras 12, 12a, 12b are used to capture respective primary images 14 and cameras 24 and 24a are used to capture respective halo images 26.
- each camera can have its own halo image and bias zone, and as such the number of halo sizes at any one time is only limited by the number of cameras. Accordingly, this gives the operator greater flexibility in selecting a suitable image for broadcasting.
- the apparatus 10 can be used to provide footage of a soccer game being played on a soccer field 34.
- the present example includes plays 94 and 96 that will be used to illustrate the relationship between the primary and halo images 14 and 26.
- the first play 94 starts at the kickoff from the centre circle, when the ball is located on the centre spot.
- the primary image 14 is positioned at a centre point of the halo image 26, as illustrated by event 98. This means that all players within the vicinity will be included in the halo image 26.
- As play progresses and player 64 runs down the field, as illustrated by event 100, the primary image 14 is positioned towards the trailing edge of the halo image 26. This means that the halo image extends forward of the player 64 even when the player changes direction, as illustrated by event 102.
- As illustrated by event 104, the halo image 26 is inhibited from extending beyond the bump bar 86.
- halo image 26 could be large enough to capture the players in front of the goal 84.
- the ball is then kicked to centre and directed into the goal 84 as illustrated by event 108.
- the halo image 26 captured by camera 24 also changes orientation to include the goal and goalie.
- the invention provides an apparatus and method of controlling a plurality of cameras to capture footage of a sporting or stage event.
- the use of at least a first camera that captures a primary image that conforms to the target object and a halo image captured by a second camera having a wider field of view means that a single operator can simply and effectively control the composition of the television broadcast.
- the use of a central control unit enables the operator to control a number of cameras by simply passing a stylus over the surface of a touch screen displaying live footage of the sporting arena.
Landscapes
- Engineering & Computer Science (AREA)
- Multimedia (AREA)
- Signal Processing (AREA)
- Human Computer Interaction (AREA)
- Computational Linguistics (AREA)
- Software Systems (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- Studio Devices (AREA)
- Closed-Circuit Television Systems (AREA)
Priority Applications (5)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US13/392,515 US20120154593A1 (en) | 2009-08-31 | 2010-07-13 | method and apparatus for relative control of multiple cameras |
EP10811004.0A EP2474162B8 (en) | 2009-08-31 | 2010-07-13 | A method and apparatus for relative control of multiple cameras |
CN201080038605.XA CN102598658B (en) | 2009-08-31 | 2010-07-13 | The relation control method and apparatus of multiple cameras |
AU2010286316A AU2010286316B2 (en) | 2009-08-31 | 2010-07-13 | A method and apparatus for relative control of multiple cameras |
JP2012525813A JP5806215B2 (en) | 2009-08-31 | 2010-07-13 | Method and apparatus for relative control of multiple cameras |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
AU2009904169A AU2009904169A0 (en) | 2009-08-31 | A method and apparatus for relative control of multiple cameras | |
AU2009904169 | 2009-08-31 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2011022755A1 true WO2011022755A1 (en) | 2011-03-03 |
Family
ID=43627063
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/AU2010/000886 WO2011022755A1 (en) | 2009-08-31 | 2010-07-13 | A method and apparatus for relative control of multiple cameras |
Country Status (6)
Country | Link |
---|---|
US (1) | US20120154593A1 (en) |
EP (1) | EP2474162B8 (en) |
JP (1) | JP5806215B2 (en) |
CN (1) | CN102598658B (en) |
AU (1) | AU2010286316B2 (en) |
WO (1) | WO2011022755A1 (en) |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
GB2497119A (en) * | 2011-12-01 | 2013-06-05 | Sony Corp | Mapping scene geometry from wide field of view image onto narrow field of view image |
Families Citing this family (34)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US9526156B2 (en) * | 2010-05-18 | 2016-12-20 | Disney Enterprises, Inc. | System and method for theatrical followspot control interface |
US20120087588A1 (en) * | 2010-10-08 | 2012-04-12 | Gerald Carter | System and method for customized viewing of visual media |
EP2892228A1 (en) | 2011-08-05 | 2015-07-08 | Fox Sports Productions, Inc. | Selective capture and presentation of native image portions |
US11039109B2 (en) | 2011-08-05 | 2021-06-15 | Fox Sports Productions, Llc | System and method for adjusting an image for a vehicle mounted camera |
US10469790B2 (en) * | 2011-08-31 | 2019-11-05 | Cablecam, Llc | Control system and method for an aerially moved payload system |
BR112015004087A2 (en) * | 2012-08-31 | 2017-07-04 | Fox Sports Productions Inc | method for tracking and marking objects of interest in a transmission; Selective view user interface captures or reproduces multiple cameras at an event; and selective view, capture or playback system of multiple cameras in one event |
KR101970197B1 (en) * | 2012-10-29 | 2019-04-18 | 에스케이 텔레콤주식회사 | Method for Controlling Multiple Camera, Apparatus therefor |
JP6551392B2 (en) * | 2013-04-05 | 2019-07-31 | アンドラ モーション テクノロジーズ インク. | System and method for controlling an apparatus for image capture |
US9754373B2 (en) * | 2013-11-25 | 2017-09-05 | Gregory J. Seita | Methods and apparatus for automated bocce measurement and scoring |
JP2016046642A (en) * | 2014-08-21 | 2016-04-04 | キヤノン株式会社 | Information processing system, information processing method, and program |
JP6452386B2 (en) * | 2014-10-29 | 2019-01-16 | キヤノン株式会社 | Imaging apparatus, imaging system, and imaging apparatus control method |
US11758238B2 (en) | 2014-12-13 | 2023-09-12 | Fox Sports Productions, Llc | Systems and methods for displaying wind characteristics and effects within a broadcast |
US11159854B2 (en) | 2014-12-13 | 2021-10-26 | Fox Sports Productions, Llc | Systems and methods for tracking and tagging objects within a broadcast |
KR102101438B1 (en) | 2015-01-29 | 2020-04-20 | 한국전자통신연구원 | Multiple camera control apparatus and method for maintaining the position and size of the object in continuous service switching point |
CN105072384A (en) * | 2015-07-23 | 2015-11-18 | 柳州正高科技有限公司 | Method for obtaining football moving images |
US10003786B2 (en) * | 2015-09-25 | 2018-06-19 | Intel Corporation | Method and system of 3D image capture with dynamic cameras |
CN105488457B (en) * | 2015-11-23 | 2019-04-16 | 北京电影学院 | Dummy emulation method and system of the camera motion control system in film shooting |
US10143907B2 (en) * | 2015-12-09 | 2018-12-04 | Gregoire Gentil | Planar solutions to object-tracking problems |
US10471304B2 (en) | 2016-03-08 | 2019-11-12 | Sportsmedia Technology Corporation | Systems and methods for integrated automated sports data collection and analytics platform |
JP6922369B2 (en) * | 2017-04-14 | 2021-08-18 | 富士通株式会社 | Viewpoint selection support program, viewpoint selection support method and viewpoint selection support device |
US10198843B1 (en) * | 2017-07-21 | 2019-02-05 | Accenture Global Solutions Limited | Conversion of 2D diagrams to 3D rich immersive content |
WO2019025833A1 (en) * | 2017-08-02 | 2019-02-07 | Playgineering Systems, Sia | A system and a method for automated filming |
JP7246005B2 (en) * | 2017-10-05 | 2023-03-27 | パナソニックIpマネジメント株式会社 | Mobile tracking device and mobile tracking method |
JP2019102907A (en) * | 2017-11-30 | 2019-06-24 | キヤノン株式会社 | Setting device, setting method, and program |
US10735826B2 (en) * | 2017-12-20 | 2020-08-04 | Intel Corporation | Free dimension format and codec |
US10832055B2 (en) * | 2018-01-31 | 2020-11-10 | Sportsmedia Technology Corporation | Systems and methods for providing video presentation and video analytics for live sporting events |
JP7366594B2 (en) * | 2018-07-31 | 2023-10-23 | キヤノン株式会社 | Information processing equipment and its control method |
CN110213611A (en) * | 2019-06-25 | 2019-09-06 | 宫珉 | A kind of ball competition field camera shooting implementation method based on artificial intelligence Visual identification technology |
KR102112517B1 (en) * | 2020-03-06 | 2020-06-05 | 모바일센 주식회사 | Unmanned sports relay service method through real time video analysis and video editing and apparatus for same |
US11653111B2 (en) | 2021-03-31 | 2023-05-16 | Apple Inc. | Exposure truncation for image sensors |
CN113329169B (en) * | 2021-04-12 | 2022-11-22 | 浙江大华技术股份有限公司 | Imaging method, imaging control apparatus, and computer-readable storage medium |
US11750922B2 (en) | 2021-09-13 | 2023-09-05 | Apple Inc. | Camera switchover control techniques for multiple-camera systems |
US12015845B2 (en) | 2021-09-13 | 2024-06-18 | Apple Inc. | Object depth estimation and camera focusing techniques for multiple-camera systems |
WO2024069788A1 (en) * | 2022-09-28 | 2024-04-04 | 株式会社RedDotDroneJapan | Mobile body system, aerial photography system, aerial photography method, and aerial photography program |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5953056A (en) | 1996-12-20 | 1999-09-14 | Whack & Track, Inc. | System and method for enhancing display of a sporting event |
WO2006024078A1 (en) * | 2004-08-30 | 2006-03-09 | Trace Optic Technologies Pty Ltd | A method and apparatus of camera control |
US20070058839A1 (en) | 2003-05-01 | 2007-03-15 | Jody Echegaray | System and method for capturing facial and body motion |
WO2007133982A2 (en) * | 2006-05-08 | 2007-11-22 | John-Paul Cana | Multi-axis control of a device based on the wireless tracking location of a target device |
Family Cites Families (12)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5164827A (en) * | 1991-08-22 | 1992-11-17 | Sensormatic Electronics Corporation | Surveillance system with master camera control of slave cameras |
US5363297A (en) * | 1992-06-05 | 1994-11-08 | Larson Noble G | Automated camera-based tracking system for sports contests |
JP3365182B2 (en) * | 1995-12-27 | 2003-01-08 | 三菱電機株式会社 | Video surveillance equipment |
JP2791310B2 (en) * | 1996-08-27 | 1998-08-27 | 幹次 村上 | Imaging device for multi-angle shooting |
US6567116B1 (en) * | 1998-11-20 | 2003-05-20 | James A. Aman | Multiple object tracking system |
US20030210329A1 (en) * | 2001-11-08 | 2003-11-13 | Aagaard Kenneth Joseph | Video system and methods for operating a video system |
US7218320B2 (en) * | 2003-03-13 | 2007-05-15 | Sony Corporation | System and method for capturing facial and body motion |
JP4314929B2 (en) * | 2003-08-22 | 2009-08-19 | パナソニック株式会社 | Motion detection device |
JP2006261999A (en) * | 2005-03-16 | 2006-09-28 | Olympus Corp | Camera, camera system, and cooperative photographing method using multiple cameras |
US20080129844A1 (en) * | 2006-10-27 | 2008-06-05 | Cusack Francis J | Apparatus for image capture with automatic and manual field of interest processing with a multi-resolution camera |
NZ598897A (en) * | 2006-12-04 | 2013-09-27 | Lynx System Developers Inc | Autonomous systems and methods for still and moving picture production |
US9185361B2 (en) * | 2008-07-29 | 2015-11-10 | Gerald Curry | Camera-based tracking and position determination for sporting events using event information and intelligence data extracted in real-time from position information |
- 2010
- 2010-07-13 CN CN201080038605.XA patent/CN102598658B/en active Active
- 2010-07-13 WO PCT/AU2010/000886 patent/WO2011022755A1/en active Application Filing
- 2010-07-13 US US13/392,515 patent/US20120154593A1/en not_active Abandoned
- 2010-07-13 EP EP10811004.0A patent/EP2474162B8/en active Active
- 2010-07-13 AU AU2010286316A patent/AU2010286316B2/en not_active Ceased
- 2010-07-13 JP JP2012525813A patent/JP5806215B2/en active Active
Non-Patent Citations (1)
Title |
---|
See also references of EP2474162A4 |
Cited By (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
GB2497119A (en) * | 2011-12-01 | 2013-06-05 | Sony Corp | Mapping scene geometry from wide field of view image onto narrow field of view image |
GB2497119B (en) * | 2011-12-01 | 2013-12-25 | Sony Corp | Image processing system and method |
US9325934B2 (en) | 2011-12-01 | 2016-04-26 | Sony Corporation | Image processing system and method |
Also Published As
Publication number | Publication date |
---|---|
CN102598658B (en) | 2016-03-16 |
EP2474162A1 (en) | 2012-07-11 |
EP2474162B1 (en) | 2019-07-03 |
AU2010286316A1 (en) | 2012-04-19 |
US20120154593A1 (en) | 2012-06-21 |
EP2474162B8 (en) | 2019-08-14 |
CN102598658A (en) | 2012-07-18 |
JP5806215B2 (en) | 2015-11-10 |
EP2474162A4 (en) | 2015-04-08 |
AU2010286316B2 (en) | 2016-05-19 |
JP2013503504A (en) | 2013-01-31 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
AU2010286316B2 (en) | A method and apparatus for relative control of multiple cameras | |
US9813610B2 (en) | Method and apparatus for relative control of multiple cameras using at least one bias zone | |
JP6719465B2 (en) | System and method for displaying wind characteristics and effects in broadcast | |
US9298986B2 (en) | Systems and methods for video processing | |
EP2277305B1 (en) | Method and apparatus for camera control and picture composition | |
US7193645B1 (en) | Video system and method of operating a video system | |
KR102189139B1 (en) | A Method and System for Producing a Video Production | |
CN113873174A (en) | Method and system for automatic television production | |
US9736462B2 (en) | Three-dimensional video production system | |
Cavallaro et al. | Augmenting live broadcast sports with 3D tracking information | |
US8957969B2 (en) | Method and apparatus for camera control and picture composition using at least two biasing means | |
JPH06105231A (en) | Picture synthesis device | |
WO2018004354A1 (en) | Camera system for filming sports venues | |
EP3836081A1 (en) | Data processing method and apparatus | |
GB2559003A (en) | Automatic camera control system for tennis and sports with multiple areas of interest | |
CA2559783A1 (en) | A system and method for graphically enhancing the visibility of an object/person in broadcasting | |
WO2016032427A1 (en) | Three-dimensional video production system |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| WWE | Wipo information: entry into national phase | Ref document number: 201080038605.X; Country of ref document: CN |
| 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 10811004; Country of ref document: EP; Kind code of ref document: A1 |
| WWE | Wipo information: entry into national phase | Ref document number: 13392515; Country of ref document: US |
| WWE | Wipo information: entry into national phase | Ref document number: 2012525813; Country of ref document: JP |
| NENP | Non-entry into the national phase | Ref country code: DE |
| WWE | Wipo information: entry into national phase | Ref document number: 2010286316; Country of ref document: AU; Ref document number: 2010811004; Country of ref document: EP |
| ENP | Entry into the national phase | Ref document number: 2010286316; Country of ref document: AU; Date of ref document: 20100713; Kind code of ref document: A |