WO2013121215A1 - Video tracking apparatus and method - Google Patents


Info

Publication number
WO2013121215A1
WO2013121215A1 (PCT/GB2013/050368)
Authority
WO
WIPO (PCT)
Prior art keywords
image acquisition
acquisition device
apparatus
image
object
Prior art date
Application number
PCT/GB2013/050368
Other languages
French (fr)
Inventor
David Watkins
Original Assignee
Overview Limited
Priority date
Filing date
Publication date
Priority to GB1202692.8 priority Critical
Priority to GB201202692A priority patent/GB2499427A/en
Application filed by Overview Limited filed Critical Overview Limited
Publication of WO2013121215A1 publication Critical patent/WO2013121215A1/en

Classifications

    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S3/00Direction-finders for determining the direction from which infrasonic, sonic, ultrasonic, or electromagnetic waves, or particle emission, not having a directional significance, are being received
    • G01S3/78Direction-finders for determining the direction from which infrasonic, sonic, ultrasonic, or electromagnetic waves, or particle emission, not having a directional significance, are being received using electromagnetic waves other than radio waves
    • G01S3/782Systems for determining direction or deviation from predetermined direction
    • G01S3/785Systems for determining direction or deviation from predetermined direction using adjustment of orientation of directivity characteristics of a detector or detector system to give a desired condition of signal derived from that detector or detector system
    • G01S3/786Systems for determining direction or deviation from predetermined direction using adjustment of orientation of directivity characteristics of a detector or detector system to give a desired condition of signal derived from that detector or detector system the desired condition being maintained automatically, i.e. tracking systems
    • G01S3/7864T.V. type tracking systems
    • GPHYSICS
    • G08SIGNALLING
    • G08BSIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
    • G08B13/00Burglar, theft or intruder alarms
    • G08B13/18Actuation by interference with heat, light or radiation of shorter wavelength; Actuation by intruding sources of heat, light or radiation of shorter wavelength
    • G08B13/189Actuation by interference with heat, light or radiation of shorter wavelength; Actuation by intruding sources of heat, light or radiation of shorter wavelength using passive radiation detection systems
    • G08B13/194Actuation by interference with heat, light or radiation of shorter wavelength; Actuation by intruding sources of heat, light or radiation of shorter wavelength using passive radiation detection systems using image scanning and comparing systems
    • G08B13/196Actuation by interference with heat, light or radiation of shorter wavelength; Actuation by intruding sources of heat, light or radiation of shorter wavelength using passive radiation detection systems using image scanning and comparing systems using television cameras
    • G08B13/19602Image analysis to detect motion of the intruder, e.g. by frame subtraction
    • G08B13/19608Tracking movement of a target, e.g. by detecting an object predefined as a target, using target direction and or velocity to predict its new position
    • GPHYSICS
    • G08SIGNALLING
    • G08BSIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
    • G08B13/00Burglar, theft or intruder alarms
    • G08B13/18Actuation by interference with heat, light or radiation of shorter wavelength; Actuation by intruding sources of heat, light or radiation of shorter wavelength
    • G08B13/189Actuation by interference with heat, light or radiation of shorter wavelength; Actuation by intruding sources of heat, light or radiation of shorter wavelength using passive radiation detection systems
    • G08B13/194Actuation by interference with heat, light or radiation of shorter wavelength; Actuation by intruding sources of heat, light or radiation of shorter wavelength using passive radiation detection systems using image scanning and comparing systems
    • G08B13/196Actuation by interference with heat, light or radiation of shorter wavelength; Actuation by intruding sources of heat, light or radiation of shorter wavelength using passive radiation detection systems using image scanning and comparing systems using television cameras
    • G08B13/19617Surveillance camera constructional details
    • G08B13/19632Camera support structures, e.g. attachment means, poles

Abstract

An object tracking apparatus is provided comprising a moveable unit (106); a first image acquisition device (102) mounted on the moveable unit; a second image acquisition device (104) mounted on the moveable unit and pointing in substantially the same direction as the first image acquisition device; and a control device (108). The control device is configured to determine, from image data received from the first image acquisition device, positional information of an object represented in the image data to be tracked; and output a control signal to the moveable unit to control its movement and cause the second image acquisition device to track the object in accordance with the determined positional information.

Description

Video tracking apparatus and method Field of the Invention The present invention relates to an apparatus for tracking an object and, in particular, to a video tracking apparatus.

Background of the Invention

Video tracking arrangements are known in which a PTZ (Pan/Tilt/Zoom) camera system follows a particular object such as a person, or a vehicle, and such arrangements are becoming popular for other applications for example with CCTV speed domes. These domes (or PTZ systems) track by referencing the area or object that is to be followed against its background, and this works well if the tracked area or object is small compared to the background. If this is not the case, for instance if a person is being tracked and the camera is zoomed in to see more of the person so that the background becomes relatively small, then there is not enough background to give meaningful reference points and the ability to track is lost. The above problem can be remedied by using two cameras installed separately alongside each other or some distance away from each other, a static wide angle camera to give a reference frame, and a narrow (possibly zoom) angle PTZ camera whose tracking input commands are derived from the wide angle view. This system works well but it is dependent on the distance between the cameras being known accurately and calibration of the system on initial installation is therefore necessary.

Speed dome solutions are known in which a 360° fish eye lens is suspended from the bottom of a viewing bubble which can provide tracking information to the dome PTZ system. This approach denies the view immediately beneath the dome to the PTZ camera but the fish eye lens is close enough here to provide all the information needed. In any case it is unlikely that the area immediately beneath the dome will contain a target. Alternative implementations provide multiple cameras arranged around the periphery of the PTZ unit and merge the individual views into a single view, which can also provide tracking commands. An advantage of these solutions is that the wide angle view camera is aware of the whole scene, independently of the PTZ camera which has a limited view. It is desirable to provide a tracking apparatus that can be easily installed and used to accurately track an object. Summary of the Invention According to a first aspect of the invention, there is provided an object tracking apparatus comprising a moveable unit; a first image acquisition device mounted on the moveable unit; a second image acquisition device mounted on the moveable unit; and a control device configured to: determine, from image data received from the first image acquisition device, positional information of an object represented in the image data to be tracked; and output a control signal to the moveable unit to control its movement and cause the second image acquisition device to track the object in accordance with the determined positional information. Advantageously, the first and second acquisition devices of the apparatus will move together when tracking an object.

Preferably, the second image acquisition device is mounted on the moveable unit in a fixed position relative to the first image acquisition device. Advantageously, this avoids the need for calibration of the relative locations of the first and second image acquisition devices on installation.

The control means may be configured to determine the position information from both image data received from the first image acquisition device, and the position of the first image acquisition device relative to the second image acquisition device.

The first image acquisition device and the second image acquisition device may be configured to point in substantially the same direction, or at the same scene. In essence, this means that the optical axes of the first and second image acquisition devices are substantially parallel.

The predefined field of view of the first image acquisition device may be wider than that of the second image acquisition device.

The field of view of the first image acquisition device may be less than 360°.

The first image acquisition device may comprise a zoom lens operable to adjust the first predefined field of view when tracking the object. The control unit may be configured to generate control signals to control the zoom lens to adjust the field of view so as to maximise accuracy of tracking of the object. The control means may be further configured to determine the positional information based on a known angle of motion of the movable unit.

The first image acquisition device and the second image acquisition device may be housed within a closed-circuit television (CCTV) camera dome.

The second image acquisition device and the mounting unit may combine to form a pan tilt zoom (PTZ) camera.

The movable unit may comprise a pan and tilt gimbal mechanism.

The movable unit may comprise at least one motor controlled by the control unit.

The first and second image acquisition devices may be video acquisition devices, and the image data may comprise image data of frames of video data.

In one embodiment, one or both of the first and second image acquisition devices is a digital camera comprising a lens having an optical axis and an image acquisition element, such as a charge coupled device, onto which an image is focussed by the lens. For the first image acquisition device, the lens may be a zoom lens operable to adjust the first predefined field of view when tracking the object. The charge coupled device is connected to image processing circuitry in the camera, which generates and outputs image data. For video, the image data may be output as a series of video frames, possibly encoded and compressed in a digital format such as MPEG video. In an alternative embodiment, an analogue camera may be used.

According to a second aspect of the invention, there is provided a method for tracking objects, the method comprising: determining, from image data received from a first image acquisition device mounted on a moveable unit, positional information of an object represented in the image data to be tracked; and outputting a control signal to the moveable unit to control its movement and cause a second image acquisition device mounted on the moveable unit to track the object in accordance with the determined positional information.

Determining positional information may comprise determining the positional information from both image data received from the first image acquisition device, and a position of the first image acquisition device relative to the second image acquisition device. According to a third aspect of the invention, there is provided a computer-readable medium comprising computer-executable instructions which, when executed, cause a processor to perform the above method for tracking objects.

Brief Description of the Drawings

The invention will now be described by way of example with reference to the accompanying drawings, in which:

Figure 1 is a schematic view of an apparatus according to an embodiment of the invention;

Figure 2a is a schematic view of an apparatus according to an embodiment of the invention;

Figure 2b is a schematic view of an apparatus according to an embodiment of the invention.

Figure 3 is a flow chart describing the steps performed when tracking an object according to an embodiment of the invention.

Figure 4 is a flow chart describing the step of determining whether an image contains a target according to an embodiment of the invention.

Figure 5a is a flow chart describing the step of determining whether an image contains a target according to an embodiment of the invention.

Figure 5b is a schematic view of an apparatus according to an embodiment of the invention.

Figure 6 is a flow chart describing the steps performed when detecting motion of a target according to an embodiment of the invention.

Detailed Description of the Drawings

The invention is now described with reference to an exemplary embodiment, as depicted in figures 1 to 6.

Figure 1 is a schematic view of an apparatus 100 according to an embodiment of the invention. The apparatus 100 includes a first camera or image acquisition device 102, a second camera or image acquisition device 104, a movable unit 106 and a control unit 108. The first camera 102 and the second camera 104 are mounted on the movable unit 106 and the first camera 102, the second camera 104 and the movable unit 106 are connected electrically with the control unit 108. The first camera 102 and the second camera 104 point in substantially the same direction, or at the same scene. In essence, this means that the optical axes of the first and second image acquisition devices are substantially parallel.

In the embodiment of the invention depicted in figure 2, the movable unit 106 is a pan and tilt gimbal mechanism (or pan and tilt shaft). The pan and tilt gimbal mechanism 106 is operated by motor 202 and another motor (not shown) through belts (not shown). In the disclosed embodiment, the first camera or image acquisition device 102 is a wide angle camera, or a camera set to a wider zoom than the second camera or image acquisition device 104, which has a high optical zoom (e.g. greater than 18, 15, 10, 9, 8, 7, 6, 5, 4, 3 or 2 times zoom). The wide angle camera 102 and the zoom camera 104 are mounted on the same gimbal mechanism 106 so that both the wide angle camera 102 and the zoom camera 104 move by the same amount when the gimbal mechanism 106 moves.

Whilst the first camera 102 (the "wide angle" camera) and the second camera 104 (the "zoom" camera) are depicted as being of a similar size in figure 2, it will be appreciated that the wide angle camera 102 can be a different size to the zoom camera 104. The control unit 108 is a personal computer or any unit comprising a processor configured to control the movable unit. The control unit 108 is connected to both cameras and the movable unit by a wired and/or a wireless connection. Operation of the apparatus in accordance with an embodiment of the invention is now described with reference to figure 3.

At step S300, a series of images or frames I0 to In is acquired by the wide angle camera 102. It will be understood that in what follows the term 'image' refers to image data, i.e. a numeric representation of a two-dimensional image, for example a bitmap image. Similarly, a series or sequence of images means a series of frames of image data which might make up a sequence of moving video represented in video data. An image comprising/including an object should be understood to mean that a subset of the image data represents the specified object. The acquired image data are provided to the control unit 108. The image data are provided to the control unit sequentially as they are received. Alternatively, multiple images are provided to the control unit at the same time. The image data may be provided to the control unit in a compressed form, in which case the control unit decompresses the image data before performing further processing.

At step S302, the control unit 108 determines whether any of the images or frames acquired from the first image acquisition device 102 comprises one or more targets to be tracked. A target may, for example, comprise a person, a vehicle, an animal etc. This step may be performed automatically or manually by a user of the apparatus, as discussed in more detail with respect to figures 4 and 5. If it is determined that none of the received images comprises a target to be tracked, the control unit 108 causes the gimbal mechanism 106 to return to a reference position or to remain at the current position, and processing returns to step S300. Alternatively, the control unit 108 may cause the gimbal mechanism 106 to move to any other position, for example a position specified by a user input. If an image or frame of the series of images or video data is determined to comprise a target to be tracked, processing proceeds to step S304, at which the motion of the target is determined. The motion of the target is determined using any motion detection means, such as (but not limited to) appearance based matching (i.e. colour, image gradients and other higher order visual features). The motion of the target is determined by comparing the relative position of the target in the first image Ii in which a target is identified (which should be considered to be a template window) to the relative position of the target in a subsequent image Ii+1 received from the wide angle camera 102. The step of determining the motion of the target is described in more detail with reference to figure 6. At step S306, the control unit 108 causes the movable unit 106 to move in accordance with the determined target motion in order to track the identified target.
The control unit generates input commands, based on the determined motion, for tracking software which then causes the gimbal mechanism to move by a required amount (e.g. by a specified angular distance) to ensure that the target is in view of the wide angle camera 102. The control unit instructs the gimbal mechanism to move such that the target is centred in the image acquired by the wide angle camera 102. In this manner the wide angle camera 102 tracks the motion of the target, whilst the zoom camera 104 acquires a detailed image of the target. If the target is determined to be stationary (i.e. the motion of the target is determined to be less than a threshold amount), the gimbal mechanism is maintained at a fixed position.
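The centring behaviour described above can be sketched as follows. The function name, the field-of-view figures, the deadband and the small-angle pixel-to-degree mapping are illustrative assumptions for this sketch, not details taken from the patent:

```python
def pan_tilt_command(target_px, frame_size, fov_deg, deadband_px=4):
    """Map the target's pixel offset from frame centre to pan/tilt angles.

    target_px  -- (x, y) centre of the tracked target in the wide-angle image
    frame_size -- (width, height) of that image in pixels
    fov_deg    -- (horizontal, vertical) field of view of the wide-angle camera
    Returns (pan_deg, tilt_deg); (0.0, 0.0) when the target lies within the
    deadband, i.e. is treated as stationary and the gimbal is held fixed.
    """
    w, h = frame_size
    dx = target_px[0] - w / 2.0
    dy = target_px[1] - h / 2.0
    if abs(dx) <= deadband_px and abs(dy) <= deadband_px:
        return 0.0, 0.0  # target effectively centred: hold position
    # Small-angle approximation: degrees per pixel derived from the field of view.
    pan = dx * fov_deg[0] / w
    tilt = -dy * fov_deg[1] / h  # image y grows downward; tilt up is positive
    return pan, tilt
```

A target 160 pixels right of centre in a 640-pixel-wide, 90° view would thus command a pan of 22.5°, which re-centres it for both mounted cameras.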

In a particular embodiment of the invention, the control unit implements a high resolution 2 axis motor control algorithm based on the determined motion of the object to drive the movable unit 106 so that the target is maintained in the view of both the mounted cameras.
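A minimal sketch of the 2-axis drive stage: converting a commanded gimbal rotation into whole motor steps per axis. The step resolution, microstepping factor and gear ratio are invented example values; a real high-resolution controller would also add acceleration profiles and closed-loop feedback:

```python
def degrees_to_steps(angle_deg, steps_per_rev=200, microstep=16, gear_ratio=5.0):
    """Convert a required gimbal rotation (degrees) to whole motor steps."""
    steps_per_degree = steps_per_rev * microstep * gear_ratio / 360.0
    return round(angle_deg * steps_per_degree)

def drive_axes(pan_deg, tilt_deg):
    """Return (pan_steps, tilt_steps) for the two gimbal motors."""
    return degrees_to_steps(pan_deg), degrees_to_steps(tilt_deg)
```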

As discussed above, both the wide angle camera 102 and the zoom camera 104 are mounted on the gimbal mechanism 106. Accordingly, movement of the gimbal mechanism to track the identified target based on the image received by the wide angle camera 102 ensures that the target is also in the view of the zoom camera 104.

The zoom camera 104 then acquires a narrow field image containing the identified object. This narrow field, or zoom, image provides a more detailed or closer view of the object to be tracked. This is because the zoom camera 104 is focused (or zoomed) to see more of the object and less of the background. Generally, this focussing or zooming is achieved through optical means, such as a zoom lens (see below), on the zoom camera 104, in comparison to a wider angle type lens on the wide angle camera 102. The wide angle camera 102 has a narrow field of view compared to a 360° fish eye lens camera. Additionally or alternatively, the wide angle camera 102 may comprise a zoom lens that can be used to adjust the field of view of the wide angle camera 102 for distance use. The wide angle camera 102 therefore allows for accurate tracking of an object, even when the target is distant from the tracking apparatus 100.
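The zoom adjustment of the field of view follows the standard lens relation FOV = 2·atan(w / 2f), where w is the sensor width and f the focal length; this relation is general optics, not specific to the patent, and the sensor and focal-length figures below are invented for illustration:

```python
import math

def field_of_view_deg(sensor_width_mm, focal_length_mm):
    """Horizontal field of view of a lens: FOV = 2 * atan(w / 2f)."""
    return math.degrees(2 * math.atan(sensor_width_mm / (2 * focal_length_mm)))
```

For a hypothetical 6.4 mm sensor, a 3.2 mm focal length gives a 90° view; doubling the focal length (zooming in for distant targets) narrows the view.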

In this manner, the control unit 108 can determine, from the image acquired by the wide angle camera 102, the background features that are stationary relative to the object being tracked. Based on this information, the control unit 108 can 'unwrap' the image and move the gimbal mechanism 106 in order to substantially centre the wide angle camera 102 and the zoom camera 104 on the identified target. The term 'unwrap' means that the control unit makes continual adjustments for the movement of the reference background in the field of view of the first image acquisition device, and then searches for one or more reference points (new, similar or identical) so as to continue to implement the control process, as previous reference points disappear from the first image acquisition device's wider field of view.
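The 'unwrap' bookkeeping described above can be sketched as follows: reference points that leave the wide-angle view are discarded and replaced by freshly detected candidates so that the control process always has references to work against. All names and the frame geometry are assumptions made for illustration:

```python
def refresh_references(points, frame_size, candidates, min_count=4):
    """Keep reference points still inside the frame, topping up from
    freshly detected candidates when too few survive the camera motion."""
    w, h = frame_size
    # Drop reference points that have left the wide-angle field of view.
    inside = [(x, y) for (x, y) in points if 0 <= x < w and 0 <= y < h]
    # Top up with new candidate points until the minimum count is restored.
    for c in candidates:
        if len(inside) >= min_count:
            break
        if c not in inside:
            inside.append(c)
    return inside
```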

Since the wide angle camera 102 and the zoom camera 104 are fixed to the same gimbal mechanism 106, calibration of the cameras can be performed during manufacture, thereby avoiding the need for calibration during installation of the tracking apparatus 100. Step S304 is now described in more detail with reference to figures 4 and 5. In an embodiment of the invention, one or both of the methods described with reference to figures 4 and 5 are used to determine if the image received from the wide angle camera 102 comprises a target to be tracked.

At step S400 of figure 4, the control unit 108 determines whether a received image comprises an object to be tracked using image processing means, for example: single image object recognition algorithms, motion based background subtraction algorithms, edge detection, or feature extraction. Additionally or alternatively, the target may be identified in accordance with a user input, in which case a display unit and input means (see below) are connected to the control unit 108 and the input means can be manipulated by a user to identify an object and generate object recognition data for the control unit, corresponding to the reference point, in the field of view of the first image acquisition device, of the object to be tracked. The object recognition data stores identifying characteristics of the object to be tracked and is continually applied to the image data from the zoom camera 104 to determine the object's position in the image represented by the image data. The control unit then outputs control signals to the movable unit 106 to cause the object to be centred in the image. In this way, the object will be tracked as it moves across the scene being acquired by the wide angle camera 102.
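The frame-differencing idea behind motion based background subtraction can be sketched as follows; pixels that change by more than a threshold between frames are flagged as a potential target. The plain nested lists stand in for image data, and the threshold and minimum-pixel values are assumptions for this sketch:

```python
def moving_pixels(prev, curr, threshold=25):
    """Return (row, col) coordinates where |curr - prev| exceeds threshold."""
    return [(r, c)
            for r, row in enumerate(curr)
            for c, v in enumerate(row)
            if abs(v - prev[r][c]) > threshold]

def contains_target(prev, curr, threshold=25, min_pixels=2):
    """Crude detection: enough changed pixels implies a moving object."""
    return len(moving_pixels(prev, curr, threshold)) >= min_pixels
```

A production system would operate on camera frames and typically add blurring and morphological filtering to suppress noise before thresholding.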

In the embodiment of the invention described with reference to figure 5a, the apparatus 100 additionally comprises a display unit 500 and input means 502 connected to the control unit, as shown in figure 5b. The display unit 500 and the input means 502 are connected to the control unit via one or both of a wired and a wireless connection. The display unit comprises any means suitable for displaying image data, for example a monitor of a personal computer, a television screen or a touch screen monitor. The input means 502 comprises any means for inputting a user selection, e.g. a mouse, a keyboard, etc.

At step S500, the control unit 108 causes the initial image data I0 to be displayed on the display unit 500. At step S502, the control unit 108 receives an input from a user via the input means 502. This input identifies a subset of the image data I0, the identified subset of data corresponding to the target to be tracked. The input may, for example, comprise a user using the input means to draw a box around the image data corresponding to the target, or using the input means to identify or select the image data corresponding to the target by some other means. Additionally or alternatively, the input may comprise the user visually identifying a target in the image data and using the input means to input a command to move the gimbal mechanism in order to centre the wide angle camera 102 on the identified target.

The control unit 108 determines (or causes a processor to determine) the motion of the target to be tracked as described in figure 6. At step S600, the control unit 108 identifies a reference object R0 in the background of the image data Ii. In this case, the background of the image data comprises image data other than the data identified as corresponding to the target. The reference object R0 is a subset of the background image data corresponding to a stationary identifiable object in the background of the image, for example data corresponding to a road, a kerb, a tree, a part of a building etc.

At step S602, the data corresponding to the target is separated or extracted from the background data of image Ii. This extraction is performed using a segmentation technique, e.g. appearance based matching based on colour, image gradients and/or other higher order features. At step S604 the location of the target relative to the reference object R0 is determined.

Steps S602 and S604 are then repeated using the subsequently received image data Ii+1. If it is determined that the subsequently received image data Ii+1 does not comprise a target, then processing continues at step S306, at which the control unit may cause the gimbal mechanism to remain stationary or to return to a central or initial position.

At S606 the motion of the target is determined from the difference between the relative position of the target to the reference object R0 in image data Ii+1 and the relative position of the target to the reference object R0 in image data Ii. The motion detection algorithm can additionally take into account physical constraints such as continuous motion (i.e. predicting where the target will be in the next frame given the previously observed motion), for example through the use of particle filtering or bootstrap techniques.
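The relative-motion computation of step S606, together with the simplest form of the continuous-motion constraint (a constant-velocity prediction), can be sketched as below. The function and variable names are illustrative, not taken from the patent, and a particle filter would replace the naive prediction in practice:

```python
def relative_motion(target_i, target_i1, ref):
    """Target motion between frames Ii and Ii+1, measured relative to
    the stationary reference object R0 at pixel position `ref`."""
    rel_i = (target_i[0] - ref[0], target_i[1] - ref[1])
    rel_i1 = (target_i1[0] - ref[0], target_i1[1] - ref[1])
    return (rel_i1[0] - rel_i[0], rel_i1[1] - rel_i[1])

def predict_next(target_i1, motion):
    """Constant-velocity prediction of the target position in frame Ii+2."""
    return (target_i1[0] + motion[0], target_i1[1] + motion[1])
```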

The entire tracking apparatus 100 is enclosed in a housing 170, for example a CCTV speed dome 'bubble'. The control unit 108 may also be enclosed (partially, but preferably wholly) within the housing 170, and preferably attached to the movable unit 106 (e.g. gimbal mechanism). The control unit 108 may preferably be integrated into one or other of the image acquisition devices and be embedded with the image processing circuitry of the one or other image acquisition devices. When the apparatus is placed inside a speed dome 'bubble', the wide angle camera view will be distorted because it will not be looking through a central axis of the bubble. However, as described, the image data acquired by the wide angle camera 102 is only required to allow the control unit 108 to determine tracking input (or commands) for causing the gimbal mechanism 106 to track the target. The image data acquired by the wide angle camera 102 is not required to provide a detailed view of the target because such a view is instead obtained by the image data acquired by the zoom camera 104. As described above, the control unit 108 only requires relative positional information in order to determine the movement required by the gimbal mechanism 106. The distortion of the image acquired by the wide angle camera 102 is constant and therefore does not prevent the control unit from accurately determining the required relative positional information.

The present invention has been described above in exemplary form with reference to the accompanying figures which represent embodiments of the invention. It will be understood that there are further embodiments of the invention that also fall within the scope of the invention as defined by the following claims.

Claims

Claims
1. An object tracking apparatus comprising:
a moveable unit;
a first image acquisition device mounted on the moveable unit;
a second image acquisition device mounted on the moveable unit; and
a control device configured to:
determine, from image data received from the first image acquisition device, positional information of an object represented in the image data to be tracked; and
output a control signal to the moveable unit to control its movement and cause the second image acquisition device to track the object in accordance with the determined positional information.
2. The apparatus of claim 1, wherein the second image acquisition device is mounted on the moveable unit in a fixed position relative to the first image acquisition device.
3. The apparatus of claim 2, wherein the control means is configured to determine the position information from both image data received from the first image acquisition device, and the position of the first image acquisition device relative to the second image acquisition device.
4. The apparatus of any one of the preceding claims, wherein the first image acquisition device and the second image acquisition device are configured to point in substantially the same direction towards a scene for image acquisition.
5. The apparatus of any one of the preceding claims, wherein the first image acquisition device has a first predefined field of view, and the second image acquisition device has a second predefined field of view, the first predefined field of view being wider than the second predefined field of view.
6. The apparatus of claim 5, wherein the first predefined field of view is less than 360°.
7. The apparatus of any one of the preceding claims, wherein the first image acquisition device comprises a zoom lens operable to adjust the first predefined field of view when tracking the object.
8. The apparatus of claim 7, wherein the control unit is configured to generate control signals to control the zoom lens to adjust the field of view so as to maximise accuracy of tracking of the object.
9. The apparatus of any of the preceding claims, wherein the control means is further configured to determine the positional information based on a known angle of motion of the movable unit.
10. The apparatus of any of the preceding claims, wherein the first image acquisition device and the second image acquisition device are housed within a closed-circuit television (CCTV) camera dome.
11. The apparatus of any of the preceding claims, wherein a combination of the second image acquisition device and the mounting unit is a pan tilt zoom (PTZ) camera.
12. The apparatus of any of the preceding claims, wherein the movable unit comprises a pan and tilt gimbal mechanism.
13. The apparatus of any one of the preceding claims, wherein the movable unit comprises at least one motor controlled by the control unit.
14. The apparatus of any one of the preceding claims, wherein the first and second image acquisition devices are video acquisition devices, and the image data comprises image data of frames of video data.
15. A method for tracking objects, comprising:
determining, from image data received from a first image acquisition device mounted on a moveable unit, positional information of an object represented in the image data to be tracked; and outputting a control signal to the moveable unit to control its movement and cause a second image acquisition device mounted on the moveable unit to track the object in accordance with the determined positional information.
16. The method of claim 15, wherein determining positional information comprises determining the positional information from both image data received from the first image acquisition device, and a position of the first image acquisition device relative to the second image acquisition device.
17. A computer-readable medium comprising computer-executable instructions which, when executed, cause a processor to perform the steps of claim 15 or claim 16.
18. An apparatus substantially as hereinbefore described with reference to any one of figures 1 to 6.
19. A method substantially as hereinbefore described with reference to any one of figures 1 to 6.
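The control loop of method claim 15 can be illustrated with a short sketch. Because both image acquisition devices are mounted on the same moveable unit, the offset of the tracked object from the centre of the first (wide-angle) device's image maps directly to a pan/tilt correction that also steers the co-mounted second device. The linear pixel-to-angle mapping, the proportional gain, and all function and parameter names below are illustrative assumptions, not taken from the patent itself:

```python
def pixel_offset_to_angles(cx, cy, img_w, img_h, hfov_deg, vfov_deg):
    """Convert the object's pixel position in the wide-angle frame to
    pan/tilt error angles in degrees, assuming a simple linear mapping
    of pixel offset across the predefined field of view."""
    pan_err = (cx / img_w - 0.5) * hfov_deg
    tilt_err = (0.5 - cy / img_h) * vfov_deg  # image y axis grows downward
    return pan_err, tilt_err


def control_signal(pan_err, tilt_err, gain=0.5):
    """Proportional control: command a fraction of the angular error each
    cycle, moving the unit (and thus both cameras) toward the object."""
    return gain * pan_err, gain * tilt_err


# Example: object detected at pixel (480, 270) in a 640x360 wide-angle
# frame whose predefined field of view is 90 deg x 50 deg.
pan_err, tilt_err = pixel_offset_to_angles(480, 270, 640, 360, 90.0, 50.0)
d_pan, d_tilt = control_signal(pan_err, tilt_err)
# The object sits right of and below centre, so the commanded movement
# pans right (positive) and tilts down (negative).
```

Claim 16's refinement, determining position from both the image data and the known relative mounting of the two devices, would amount to adding a fixed angular offset between the two optical axes before issuing the command.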
PCT/GB2013/050368 2012-02-16 2013-02-15 Video tracking apparatus and method WO2013121215A1 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
GB1202692.8 2012-02-16
GB201202692A GB2499427A (en) 2012-02-16 2012-02-16 Video tracking apparatus having two cameras mounted on a moveable unit

Publications (1)

Publication Number Publication Date
WO2013121215A1 true WO2013121215A1 (en) 2013-08-22

Family

ID=45939740

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/GB2013/050368 WO2013121215A1 (en) 2012-02-16 2013-02-15 Video tracking apparatus and method

Country Status (2)

Country Link
GB (1) GB2499427A (en)
WO (1) WO2013121215A1 (en)

Cited By (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9498678B2 (en) 2014-07-11 2016-11-22 ProSports Technologies, LLC Ball tracker camera
US9571903B2 (en) 2014-07-11 2017-02-14 ProSports Technologies, LLC Ball tracker snippets
US9591336B2 (en) 2014-07-11 2017-03-07 ProSports Technologies, LLC Camera feed distribution from event venue virtual seat cameras
US9655027B1 (en) 2014-07-11 2017-05-16 ProSports Technologies, LLC Event data transmission to eventgoer devices
US9699523B1 (en) 2014-09-08 2017-07-04 ProSports Technologies, LLC Automated clip creation
US9729644B1 (en) 2014-07-28 2017-08-08 ProSports Technologies, LLC Event and fantasy league data transmission to eventgoer devices
US9760572B1 (en) 2014-07-11 2017-09-12 ProSports Technologies, LLC Event-based content collection for network-based distribution

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20150042574A (en) * 2013-10-11 2015-04-21 엘지전자 주식회사 Mobile terminal and method for controlling thereof
CN106506969B (en) * 2016-11-29 2019-07-19 Oppo广东移动通信有限公司 Camera module, the method and electronic equipment that portrait tracking is carried out by it

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6215519B1 (en) * 1998-03-04 2001-04-10 The Trustees Of Columbia University In The City Of New York Combined wide angle and narrow angle imaging system and method for surveillance and monitoring
US6326994B1 (en) * 1997-01-22 2001-12-04 Sony Corporation Matched field-of-view stereographic imaging apparatus
US20050185945A1 (en) * 2002-10-22 2005-08-25 Xiaolin Zhang Bionic automatic vision and line of sight control system and method

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP4188394B2 (en) * 2005-09-20 2008-11-26 フジノン株式会社 Monitoring camera apparatus and a surveillance camera system
JP2009284452A (en) * 2008-05-23 2009-12-03 Advas Co Ltd Hybrid video camera imaging apparatus and system
JP5229726B2 (en) * 2008-08-20 2013-07-03 国立大学法人東京工業大学 Long-distance target exploration camera system


Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
BRUNO LAIN ET AL: "Robust object tracking in a dual camera sensor", HUMAN SYSTEM INTERACTIONS (HSI), 2011 4TH INTERNATIONAL CONFERENCE ON, IEEE, 19 May 2011 (2011-05-19), pages 150-157, XP031998601, DOI: 10.1109/HSI.2011.5937358 ISBN: 978-1-4244-9638-9 *


Also Published As

Publication number Publication date
GB2499427A (en) 2013-08-21
GB201202692D0 (en) 2012-04-04

Similar Documents

Publication Publication Date Title
US8995785B2 (en) Light-field processing and analysis, camera control, and user interfaces and interaction on light-field capture devices
US7750936B2 (en) Immersive surveillance system interface
US8692881B2 (en) System and method for correlating camera views
EP1765014B1 (en) Surveillance camera apparatus and surveillance camera system
US8860760B2 (en) Augmented reality (AR) system and method for tracking parts and visually cueing a user to identify and locate parts in a scene
JP4715909B2 Image processing apparatus and method, image processing system, and image processing program
JP4451122B2 (en) Video tracking system and method
US20050084179A1 (en) Method and apparatus for performing iris recognition from an image
CN100531373C (en) Video frequency motion target close-up trace monitoring method based on double-camera head linkage structure
US20060187305A1 (en) Digital processing of video images
US20060028550A1 (en) Surveillance system and method
US7336297B2 (en) Camera-linked surveillance system
US20150178930A1 (en) Systems, methods, and apparatus for generating metadata relating to spatial regions of non-uniform size
WO2016048017A1 (en) Transmission of three-dimensional video
US7806604B2 (en) Face detection and tracking in a wide field of view
JP6109185B2 (en) Control based on the map
US20040119819A1 (en) Method and system for performing surveillance
US8180107B2 (en) Active coordinated tracking for multi-camera systems
US20060215031A1 (en) Method and system for camera autocalibration
US20080129844A1 (en) Apparatus for image capture with automatic and manual field of interest processing with a multi-resolution camera
Senior et al. Acquiring multi-scale images by pan-tilt-zoom control and automatic multi-camera calibration
EP1981278A1 (en) Automatic tracking device and automatic tracking method
US20160379061A1 (en) Methods, devices and systems for detecting objects in a video
EP1914682B1 (en) Image processing system and method for improving repeatability
US7239719B2 (en) Automatic target detection and motion analysis from image data

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 13707902

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase in:

Ref country code: DE

122 Ep: pct app. not ent. europ. phase

Ref document number: 13707902

Country of ref document: EP

Kind code of ref document: A1