US20220236359A1 - Cooperative automatic tracking
- Publication number
- US20220236359A1 (application Ser. No. 17/720,086)
- Authority
- United States
- Prior art keywords
- antenna
- tracking
- antennas
- beacon
- wide angle
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S3/00—Direction-finders for determining the direction from which infrasonic, sonic, ultrasonic, or electromagnetic waves, or particle emission, not having a directional significance, are being received
- G01S3/02—Direction-finders for determining the direction from which infrasonic, sonic, ultrasonic, or electromagnetic waves, or particle emission, not having a directional significance, are being received using radio waves
- G01S3/14—Systems for determining direction or deviation from predetermined direction
- G01S3/28—Systems for determining direction or deviation from predetermined direction using amplitude comparison of signals derived simultaneously from receiving antennas or antenna systems having differently-oriented directivity characteristics
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F18/00—Pattern recognition
- G06F18/20—Analysing
- G06F18/24—Classification techniques
- G06F18/241—Classification techniques relating to the classification model, e.g. parametric or non-parametric approaches
- G06F18/2413—Classification techniques relating to the classification model, e.g. parametric or non-parametric approaches based on distances to training or reference patterns
-
- G06K9/627—
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/40—Scenes; Scene-specific elements in video content
-
- G—PHYSICS
- G11—INFORMATION STORAGE
- G11B—INFORMATION STORAGE BASED ON RELATIVE MOVEMENT BETWEEN RECORD CARRIER AND TRANSDUCER
- G11B27/00—Editing; Indexing; Addressing; Timing or synchronising; Monitoring; Measuring tape travel
- G11B27/02—Editing, e.g. varying the order of information signals recorded on, or reproduced from, record carriers
- G11B27/031—Electronic editing of digitised analogue information signals, e.g. audio or video signals
- G11B27/034—Electronic editing of digitised analogue information signals, e.g. audio or video signals on discs
-
- H—ELECTRICITY
- H01—ELECTRIC ELEMENTS
- H01Q—ANTENNAS, i.e. RADIO AERIALS
- H01Q3/00—Arrangements for changing or varying the orientation or the shape of the directional pattern of the waves radiated from an antenna or antenna system
- H01Q3/02—Arrangements for changing or varying the orientation or the shape of the directional pattern of the waves radiated from an antenna or antenna system using mechanical movement of antenna or antenna system as a whole
- H01Q3/08—Arrangements for changing or varying the orientation or the shape of the directional pattern of the waves radiated from an antenna or antenna system using mechanical movement of antenna or antenna system as a whole for varying two co-ordinates of the orientation
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N7/00—Television systems
- H04N7/18—Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
Definitions
- FIG. 1 shows an illustration of a traditional antenna based tracking system featuring three antennas.
- FIG. 2 is a schematic drawing explaining the operation of a traditional radar based tracking system with two directional antennas.
- FIG. 3 is a graph showing patch antenna gain versus signal's angle of incidence.
- FIG. 4 is a schematic representation of an antenna arrangement of an automated cooperative object tracking system in relation to a beacon according to a preferred embodiment of the present invention.
- FIG. 5 is a schematic depiction of the angular distribution of gains of an antenna pair according to a first preferred embodiment of the present invention.
- FIG. 6 is a schematic depiction of the angular distribution of gains of three antennas according to another preferred embodiment of the present invention.
- FIG. 7 is a graph depicting the change of the angular location of a beacon orbiting a pointing device as detected according to the inventive method hereof and as deduced from GPS location detection data.
- FIG. 8 is a perspective view of a pan and tilt tracking unit of an automated cooperative object tracking system having two pairs of directional solid state antennas according to a preferred embodiment of the present invention.
- FIG. 9 is a schematic drawing of a wiring diagram of antennas and other components of an automated cooperative object tracking system according to a preferred embodiment of the present invention.
- FIG. 10 is a flow diagram of a method of orientation according to a preferred embodiment of the present invention.
- FIG. 11 is a flow diagram of an automated editing and publishing method of the footage recorded by the inventive system.
- The present invention relates to the field of orienting a pointing device at a beacon.
- The present invention also relates to the field of automatic cooperative object tracking (COT).
- The present invention also relates to the field of automatic video recording using line of sight (LOS) technology.
- The present invention also relates to monopulse amplitude comparison based radar tracking.
- In a system for cooperative tracking of an object, a pointer (also referred to as a pointing device) is associated with a pan-tilt mechanism. Two (or more) antennas are associated with the pointer.
- a beacon is associated with an object to be tracked. The beacon emits an identifiable signal detectable by the antennas.
- a microcontroller compares the gains of the two antennas and causes the pan-tilt mechanism to turn in the direction of the antenna with the higher gain.
- pairs of small solid state antennas are oriented in substantially opposite directions.
- FIG. 1 shows an illustration of a traditional antenna based tracking system featuring three antennas.
- the direction of an incoming wave signal can be determined using directional antennas.
- the traditional Monopulse Amplitude Comparison (MAC) method is illustrated by FIG. 1 and FIG. 2 .
- the application of this method requires the use of large, highly directional antennas, such as those depicted in FIG. 1 .
- the antenna system depicted in FIG. 1 is very large and is transportable only with the use of a tractor-trailer.
- An example of the antenna system depicted in FIG. 1 is described in the following publication: C. T. Nadovich, J. F. Aubin, D. R. Frey, An Instrumentation Radar System for Use in Dynamic Signature Measurements (Jun. 1, 1992) (available at http://www.microwavevision.com/sites/www.microwavevision.com/files/files/ORBIT-FR-InstrumentationRadarSystem-92-06-02-Nadovich_0
- FIG. 2 is a schematic drawing explaining the operation of a traditional radar based tracking system with two directional antennas.
- FIG. 2 shows how two radar antennas are oriented such that the angle between their maximum gain vectors is only a few degrees.
- Gain values of a received signal are compared between the two antennas of FIG. 2 and the angle ⁇ of the incoming signal from a source denoted as “target” can be calculated as the deviation from the orientation of the system of the antennas.
- this orientation is the line of symmetry between the two gain vector distributions (Beam 1 and Beam 2 ), which is denoted as “Crossover axis” in FIG. 2 .
- the angle Θ is measured relative to this crossover axis.
- At least three antennas are needed; these may be thought of as grouped into two pairs of antennas with orientation directions in two different (intersecting) planes. Highly directional antennas are too large to be used in small consumer electronic applications.
- An important feature of the MAC method is that the antennas are oriented in just slightly different directions, ⁇ s in FIG. 2 . Here the size of the angle ⁇ s is tied to the directionality (acceptance angle) of the antennas such that there is some overlap between the gain curves Beam 1 and Beam 2 of the antennas.
- the present invention is an implementation of line of sight (“LOS”) technology for cooperative object tracking (“COT”).
- One important application of the cooperative object tracking described herein is the automated video recording of freely moving subjects.
- a subject is equipped with a “remote device” that may be carried or worn that is also a radiation transmitter.
- the “remote device” may also be referred to herein as a beacon or target.
- an automated cooperative object tracking system preferably comprises one or more receiver devices that receive the transmission of the beacon; the information contained therein is used to orient a pointing device, such as a camera, at the beacon and, by implication, at the subject.
- a multiplicity of beacons and/or a multiplicity of pointing devices may be used within the automated cooperative object tracking system.
- One of the disadvantages of using highly directional antennas for object tracking is that if the object is significantly off the direction of the antenna, the object is difficult to locate.
- the system works only if the target lies in a direction where the gain of at least one of the antennas is sufficient to detect it; otherwise, the system shown in FIG. 2 cannot locate the target easily, if at all.
- the present invention is advantageous in that the object is never “lost” because the antennas employed are not narrowly oriented as they are in FIG. 2 .
- the inventive system uses two small (centimeter sized) solid state antennas facing very different directions (in some cases close to 180 degrees apart, i.e., nearly opposite directions; in other cases the angle may be as small as 50 degrees) to determine which of the pair receives the transmitted signal more strongly.
- An example of the antenna that may be used is a patch antenna.
- a patch antenna is a type of radio antenna with a low profile mountable on a flat surface.
- patch antennas have a flat, rectangular sheet or “patch” of metal mounted over a larger sheet of metal called a ground plane.
- a typical patch antenna gain pattern is shown in FIG. 3 .
- FIG. 3 illustrates patch antenna gain vs. signal's angle of incidence.
- the over 60 degree wide (3 dB drop-off) maximum gain width is too broad for traditional MAC techniques.
- the inventive solution of the present invention requires only a steep gain slope, which is characteristic of small patch antennas at about 90 degrees from the surface normal. In FIG. 3 the steep gain slopes are at about +200 degrees and at about +340 degrees (which may also be thought of as −20 degrees). Patch antennas may also be specially designed to have a sharp drop-off in gain at other particular angles.
- the antennas constitute a pair (in other words, the antennas are paired) when the two antennas are directed in substantially different directions such that the angular gradient of the antennas' gains is maximal in the direction of the half angle between the antennas' orientations.
- when the antennas are paired, there is naturally no gain overlap except within and near the common plane of the axes (the common central plane) of the paired antennas. This is important in those instances when the beacon is not within or close to this plane. If this condition does not hold for a particular make and/or model of antenna, additional features can be added to the mounting to ensure that there is substantially no gain overlap outside of the common central plane.
- the invention described herein uses, as an example, radio frequency (RF) radiation emitted by the beacon. While this choice implies the use of particular equipment (RF antennas, etc.), it does not imply that using other types of radiation is outside the scope of the present invention.
- infrared (IR) and ultrasonic radiation sources and detectors are also available, and it is a matter of technological detail and choice which type of radiation is best to employ.
- the wavelength choice within the RF band is of some significance; the free availability of off-the-shelf equipment that does not require additional certification may be balanced against the desire to choose wavelengths at which reflection effects (multipath errors) are minimal.
- Additional mounting and shielding techniques may be used to cause the gain of the antenna to drop sharply at a particular angle.
- mechanical barriers may be used such as extending the printed circuit board (“PCB”) on which a patch antenna is mounted to block signals past a desired angle (e.g., 90 degrees).
- the directionality of the antenna is not important. That is to say, it is not important in the same way as in the case of the example shown in FIG. 2 where a narrow maximum gain is necessary in the direction of the antenna.
- the sharp drop-off of the gain along a particular direction is the requirement.
- the important feature is that the antenna's gain drops sharply as the signal's angle of incidence passes a certain angle.
- FIG. 4 shows a top view of a preferred embodiment of automated cooperative object tracking system 100 .
- automated cooperative object tracking system 100 is used to orient camera 25 at a beacon 60 .
- Beacon 60 is also a radiation transmitter.
- Automated cooperative tracking system 100 comprises a pair of patch antennas 20 and 30 .
- the patch antennas are mounted on two sides of panning unit 10 that can turn about an axis A (axis A is perpendicular to the plane of the drawing).
- Antennas 20 and 30 are mounted on PCBs 40 and 50 , respectively.
- an extended PCB is a PCB that has modifications of, for example, the size, thickness, coating, etc., to modify the gain profile of the attached antenna
- Antenna 20 receives signal from beacon 60 at an angle ⁇ 1 that is less than the drop off angle while antenna 30 receives the signal from an angle ⁇ 2 that exceeds the drop off angle.
- antenna 20 receives a stronger signal than antenna 30 and a processing unit/microcontroller (not shown) compares the gain levels of the received signals by each antenna using, for example, the received signal strength indicator (RSSI) method.
- the microcontroller will direct a turning mechanism (not shown) to rotate panning unit 10 about axis A in an attempt to keep the gain levels of the two antennas the same, i.e., in the direction that minimizes the difference between the signal intensities detected by antenna 20 and antenna 30 (counterclockwise in the example of FIG. 4).
- FIG. 5 is a schematic depiction of the angular distribution of gains of an antenna pair according to a preferred embodiment of the present invention.
- FIG. 5 illustrates the operation of the automated cooperative tracking system of FIG. 4 further.
- the orientation of the antennas is tied to the orientation of the camera 25 .
- the orientation of the optical axis of camera 25 is defined as 0 degrees.
- the panning unit turns in the direction of the antenna with the higher gain.
- the antenna with the higher gain turns away from the beacon and the antenna with the lower gain turns toward the beacon.
- the relationship between gain and the turning of the camera can be generally stated as follows: If the left antenna gain is greater than the right antenna gain, then the camera is rotated to the left; if the right antenna gain is greater than the left antenna gain, then the camera is rotated to the right.
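- The left/right rule above can be sketched as a minimal per-cycle control step. This is an illustrative assumption, not code from the patent; the function name and the 1 dB dead band (added to avoid jitter when the gains are nearly balanced) are hypothetical.

```python
# Hypothetical sketch of the core panning rule: compare the RSSI of the
# left- and right-facing antennas and turn toward the stronger one.
# A small dead band avoids jitter when the gains are nearly equal.

def pan_step(rssi_left: float, rssi_right: float, dead_band_db: float = 1.0) -> str:
    """Return the pan command for one control cycle ("left", "right", or "hold")."""
    diff = rssi_left - rssi_right
    if abs(diff) <= dead_band_db:
        return "hold"            # gains balanced: beacon is near 0 degrees
    return "left" if diff > 0 else "right"

print(pan_step(-40.0, -55.0))    # left antenna much stronger -> turn left
print(pan_step(-50.0, -50.5))    # within the dead band -> hold
```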
- the two antennas may have slightly different gain patterns.
- a calibration procedure solves this issue.
- the calibration may be done by placing a transmitter directly in front of the panning unit (at 0 degrees, by definition) and taking measurements of the received signal strength of each antenna.
- ideally, the two antennas would receive the transmissions with equal strength when the transmitter is directly in front of the panning unit.
- in practice, the strength of the received signal may differ between the two antennas even when the transmitter is directly in front of the panning unit.
- Using the calibration values accounts for the differences in gain patterns.
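- The calibration described above can be sketched as per-antenna offsets measured with the transmitter at 0 degrees. The function names and the offset scheme are illustrative assumptions; the patent does not specify how the calibration values are stored or applied.

```python
# Hypothetical calibration sketch: with the transmitter directly in front
# of the panning unit (0 degrees), record each antenna's RSSI and derive
# per-antenna offsets so that calibrated readings are equal straight ahead.

def calibrate(rssi_at_zero: dict) -> dict:
    """Offsets that bring every antenna up to the strongest one's reading."""
    ref = max(rssi_at_zero.values())
    return {name: ref - rssi for name, rssi in rssi_at_zero.items()}

def apply_calibration(raw: dict, offsets: dict) -> dict:
    """Apply the stored offsets to a set of raw RSSI readings."""
    return {name: rssi + offsets[name] for name, rssi in raw.items()}

offsets = calibrate({"left": -47.0, "right": -45.0})    # right antenna is 2 dB "hot"
corrected = apply_calibration({"left": -47.0, "right": -45.0}, offsets)
print(corrected)   # both antennas now read the same value at 0 degrees
```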
- the antenna arrangement illustrated in FIG. 4 and FIG. 5 has the following deficiency: if the beacon is in the hemisphere behind the camera and there is a similar crossover of the antenna gains at another angle (for example, at 180 degrees), then the panning device may settle at that crossover, leaving the camera pointed in a direction that is not (and may even be the opposite of) the direction of the beacon. This is not a problem if the system is always used to follow targets in the front hemisphere with respect to the camera. In a preferred embodiment of the present invention, this problem is resolved by using three or more antennas in the same plane. This arrangement is illustrated by FIG. 6.
- FIG. 6 is a schematic depiction of the angular distribution of gains of three antennas according to another preferred embodiment of the present invention.
- the three antennas are shown schematically having similar gain patterns that provide relatively high gains within a window of about 150 degrees.
- the gain windows have sharp edges where the gain falls off. Antenna 1 and Antenna 2 are mounted such that their overlapping edges are at 0 degrees, which is the direction of the camera.
- Antenna 3 is mounted such that it has relatively wide overlaps both with Antenna 1 and Antenna 2 .
- both Antenna 2 and Antenna 3 register signal from Beacon 2 .
- the tracking unit will turn to the right and this process will continue until Antenna 1 begins to register Beacon 2 and further until the signals on Antenna 1 and Antenna 2 are equal.
- Only Antenna 3 registers signal from Beacon 3.
- the tracking unit will begin to turn in a preprogrammed direction (either to the right, or to the left). Eventually, Beacon 3 will register on Antenna 1 (if the unit is turning to the left) or on Antenna 2 (if the unit is turning to the right). The turning will continue until Beacon 3 registers on both Antenna 1 and Antenna 2 and further until the signals on Antenna 1 and Antenna 2 are equal.
- the tracking unit of the automated cooperative tracking system may be programmed to sense and register whether the gain of Antenna 3 increases or decreases after the tracking unit first starts turning. If the gain increases, the tracking unit reverses its turning direction, but if the gain decreases, the tracking unit will keep turning in the initial direction. This modification may decrease the time that elapses between first registering Beacon 3 and finally having the camera oriented at this beacon.
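- The search logic described above can be sketched as follows. All names, the detection threshold, and the preprogrammed initial direction are illustrative assumptions; the patent describes the behavior, not an implementation.

```python
# Hypothetical sketch of direction resolution with a rear-facing Antenna 3.
# If a front antenna sees the beacon, fall back to pairwise comparison.
# If only Antenna 3 sees it, start turning in a preprogrammed direction;
# if Antenna 3's gain then increases, the unit is turning the wrong way
# and reverses, as described in the text above.

def choose_turn(a1: float, a2: float, a3: float,
                a3_prev: float, current_turn: str) -> str:
    detected = -90.0                       # assumed detection threshold, dB
    if a1 > detected or a2 > detected:
        return "left" if a1 > a2 else "right"   # front antennas in play
    if current_turn == "none":
        return "right"                     # preprogrammed initial direction
    if a3 > a3_prev:
        # Beacon moving deeper into the rear lobe: reverse the turn.
        return "left" if current_turn == "right" else "right"
    return current_turn                    # gain falling: keep turning
```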
- the antenna arrangement of the automated cooperative tracking system of the present invention is not limited to three antennas.
- antenna systems of this kind may be referred to as Paired Broad Gain Antenna (PBGA) systems.
- for full pan-and-tilt tracking, two PBGA systems must be employed, preferably (but not necessarily) arranged orthogonally.
- Antenna 1 and Antenna N constitute the pair of antennas that makes the arrangement a paired antenna system.
- FIG. 7 is a graph depicting the change of the angular location of a beacon orbiting a pointing device as detected according to the inventive method hereof and as deduced from GPS location detection data.
- the graph of FIG. 7 shows experimental data regarded as proof of concept for the automated cooperative object tracking system and method of the present invention.
- a beacon was equipped with both a radio transmitter and a GPS antenna.
- a pair of off-the-shelf TP-Link® TL-ANT2409A antennas were used as the receivers, in keeping with the arrangement illustrated in FIG. 4.
- the antennas were stationary and the beacon was moved around to position it at different angles with respect to the antennas.
- the angular positions of the beacon were recorded using the GPS signal and are shown as black squares.
- the apparent angular positions of the beacon were also measured using the inventive method and are shown as empty circles in FIG. 7 . Considering that there were no measures taken to filter out electronic and other noise, or to optimize the apparatus in any respect, the data show an acceptable degree of agreement between the angular positions deduced from GPS locating and those obtained using the inventive apparatus and method.
- To orient a pointing device at a source of radiation in three-dimensional space, one has to turn the pointing device about two axes that may be referred to as the pan and tilt axes.
- the example illustrated in FIG. 4 may be regarded as either the pan portion or the tilt portion of such a pan and tilt apparatus.
- the tilt portion of the apparatus works essentially on the same principles as described in conjunction with FIG. 4 and FIG. 5 .
- the apparatus comprises a panning unit mounted on a base (such as, for example, a tripod) and a tilting unit mounted on the panning unit.
- if the antennas for panning are mounted on the sides of the panning unit as described and shown above in FIG. 4, then the incoming signal from a moving radiation source that is at times at a higher elevation and at other times at a lower elevation arrives at the antennas at varying angles. In such a situation, the characteristics of the antennas may not be the same in all these directions. This may defeat the calibration described above and may lead to inaccuracies in pointing. To avoid this, it is preferable to mount both the pan and the tilt antennas on the tilt portion of the orienting apparatus.
- FIG. 8 is a perspective view of a pan-tilt mechanism of an automated cooperative object tracking system having two pairs of directional solid state antennas according to a preferred embodiment of the present invention.
- FIG. 8 illustrates one possible mounting arrangement of two patch antenna pairs on a pan-tilt mechanism 200 .
- Panning mechanism 210 may be mounted on a tripod or other base via a rotatable shaft that permits the rotation of panning mechanism 210 around axis A shown as a dashed-dotted line.
- Panning mechanism 210 is operationally coupled to tilting mechanism 220 .
- Tilting mechanism 220 is capable of rotation about axis B, also shown as a dashed-dotted line.
- a portion of tilting mechanism 220 is shaft 225 that serves as the mounting base for patch antenna 230 , situated on the left end of shaft 225 .
- Another patch antenna is similarly mounted on the right end of shaft 225 but is not shown in the drawing.
- Bases 240 and 250 hold patch antennas 245 and 255 that provide tilt information for the microcontroller (not shown) of pan-tilt mechanism 200 .
- one of the patch antennas is mounted facing up (antenna 255 ) and the other is mounted facing down (antenna 245 , shown using dashed lines). As noted above, more antennas may be used.
- Pan-tilt mechanism 200 is also equipped with omnidirectional antenna 260 used to receive and to send radio signals. Screw 270 is used to mount a camera or other pointing device on pan-tilt mechanism 200 .
- the RF tracking method of the present invention may have a high sampling rate but noisy output. Therefore, filtering algorithms are preferably employed to smooth the tracking data.
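- One common choice for such smoothing (an assumption here; the patent does not name a specific filter) is an exponential moving average over the RSSI stream:

```python
# Exponential moving average over noisy RSSI samples. The alpha value is
# an illustrative assumption: smaller alpha smooths more but lags more.

def smooth(samples, alpha: float = 0.2):
    """Exponentially smooth a sequence of RSSI readings (dB)."""
    out, est = [], None
    for s in samples:
        est = s if est is None else alpha * s + (1 - alpha) * est
        out.append(est)
    return out

noisy = [-50.0, -49.0, -62.0, -50.0, -51.0, -48.0, -50.0]  # one multipath spike
print(smooth(noisy))   # the -62 dB outlier is heavily attenuated
```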
- the signal emitted by the beacon preferably carries a code that makes it distinguishable from any otherwise similar background radiation.
- FIG. 9 is a schematic drawing of a wiring diagram of antennas and other components of an automated cooperative object tracking system according to a preferred embodiment of the present invention.
- all patch antennas are connected to a single amplifier and gain measurements are preferably carried out sequentially.
- automated cooperative object tracking system 300 comprises four patch antennas 315 , 325 , 335 , and 345 , and one non-directional antenna 350 . All patch antennas are shown together with PCBs 310 , 320 , 330 , and 340 on which they are mounted, respectively. All antennas are connected directly or via the PCB to switch 360 that connects each antenna sequentially to microcontroller 370 .
- Microcontroller 370 may preferably be of the type known as “RX MCU”.
- the gains may be measured in the following order: left-facing antenna (antenna 345 ), right-facing antenna (antenna 335 ), up-facing antenna (antenna 325 ), and down-facing antenna (antenna 315 , shown in dashed lines). This method is useful to minimize equipment cost, but it assumes that measurements may be carried out sufficiently fast.
- the system architecture shown and described in FIG. 9 may be modified such that only some antennas are grouped and queried in sequence. For example, each PBGA system of a pan-tilt tracker may have its own grouping of antennas in which the antennas are read sequentially.
- using the sequential gain measurement embodiment increases the effective time between readings of any one antenna by a factor of 4 (or 5, if antenna 350 requires equal time). Whether this is acceptable for a particular purpose depends on the expected velocities of the beacons and on the other technical characteristics of the system.
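- The single-amplifier, switched readout described above can be sketched as follows. The polling order matches the order given in the text; the function names and the simulated RSSI values are illustrative assumptions.

```python
# Hypothetical sketch of the switched readout of FIG. 9: the
# microcontroller selects each antenna in turn through the switch and
# records one RSSI sample per position, so the per-antenna sampling rate
# drops by a factor equal to the number of antennas polled.

POLL_ORDER = ["left", "right", "up", "down"]   # antennas 345, 335, 325, 315

def poll_cycle(read_rssi):
    """One full polling cycle; read_rssi(name) models switch + amplifier."""
    return {name: read_rssi(name) for name in POLL_ORDER}

# Simulated readout: pretend the beacon is left of and above the unit.
fake_rssi = {"left": -48.0, "right": -60.0, "up": -50.0, "down": -58.0}
readings = poll_cycle(fake_rssi.get)
pan_error = readings["left"] - readings["right"]    # positive -> pan left
tilt_error = readings["up"] - readings["down"]      # positive -> tilt up
print(pan_error, tilt_error)
```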
- cameras that follow or track a beacon are preferably automatically turned on and off based on the beacon's proximity to the camera, or to a particular camera if more than one camera is used.
- the beacon is preferably tracked using multiple pointing devices and each pointing device itself may also be used as a beacon.
- each pointing device itself may also be used as a beacon.
- Setup of such a system comprises determining angles between the directions of any second and third pointers with respect to a first pointer and then using geometrical calculations (triangulation).
- the location of the beacon may be determined using orientation data from any two of the pointers.
- the location of the beacon may be determined using multiple sets of independent data. This opens the possibility of using such data to determine the beacon's location with high certainty and to eliminate multipath effects. Eliminating multipath effects is particularly important when such a system is used indoors where walls can reflect radio waves.
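- The triangulation from any two pointers can be sketched by intersecting two bearing rays in a shared 2-D coordinate frame. The function, the frame, and the bearing convention (angles measured from the +x axis) are illustrative assumptions, not from the patent.

```python
import math

# Hypothetical triangulation sketch: two pointers at known positions each
# report a bearing to the beacon; intersecting the two rays locates it.

def locate(p1, bearing1_deg, p2, bearing2_deg):
    """Intersect two bearing rays; bearings are measured from the +x axis."""
    d1 = (math.cos(math.radians(bearing1_deg)), math.sin(math.radians(bearing1_deg)))
    d2 = (math.cos(math.radians(bearing2_deg)), math.sin(math.radians(bearing2_deg)))
    # Solve p1 + t*d1 = p2 + s*d2 for t via the 2x2 cross-product formula.
    denom = d1[0] * d2[1] - d1[1] * d2[0]
    if abs(denom) < 1e-12:
        raise ValueError("bearings are parallel; no unique intersection")
    t = ((p2[0] - p1[0]) * d2[1] - (p2[1] - p1[1]) * d2[0]) / denom
    return (p1[0] + t * d1[0], p1[1] + t * d1[1])

# Pointer A at the origin sees the beacon at 45 degrees; pointer B at
# (10, 0) sees it at 135 degrees: the rays cross at (5, 5).
print(locate((0, 0), 45.0, (10, 0), 135.0))
```

With three or more pointers, each pair yields an independent fix, which is what allows outlier (multipath) fixes to be detected and discarded.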
- beacon 60 is also equipped with one or more inertial measurement units (IMUs). Locating/orienting techniques are prone to errors (e.g., multipath errors) and blocked signals. These location errors can be reduced using one or more IMUs on the beacon.
- the IMU measures the beacon's accelerations, and the acceleration data can be transmitted to the orienting pan and tilt unit to be used by the microcontroller to supplement the measurements of the beacon's position.
- a 3-axis accelerometer placed on the transmitter can send information relating to the magnitudes of accelerations experienced by the beacon (also called “motion indication values” [“MIVs”]).
- the filter settings on the pan and tilt unit may be adjusted in real time based on the MIVs received such that if an instantaneous location determination is far from the previous location indications, and the MIV is reporting low values of accelerations, it is likely the new signal is the result of a multi-path reflection. Such a signal should either be ignored or heavily filtered.
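- The MIV-based rejection rule above can be sketched as a simple gate. The thresholds and names are illustrative assumptions; the patent only states the principle (a large apparent jump while the beacon reports low acceleration is likely multipath).

```python
# Hypothetical sketch of MIV-gated multipath rejection: a large jump in
# the measured bearing while the beacon reports low acceleration is more
# plausibly a reflection than real motion, so the sample is rejected.

def accept_sample(prev_angle_deg: float, new_angle_deg: float, miv: float,
                  jump_limit_deg: float = 20.0, low_motion: float = 0.5) -> bool:
    """Return False for samples that should be ignored or heavily filtered."""
    jump = abs(new_angle_deg - prev_angle_deg)
    if jump > jump_limit_deg and miv < low_motion:
        return False           # likely multipath reflection
    return True

print(accept_sample(10.0, 55.0, miv=0.1))   # big jump, beacon nearly still
print(accept_sample(10.0, 55.0, miv=3.0))   # big jump, beacon accelerating hard
```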
- using a 6-axis IMU (i.e., a 3-axis accelerometer and a 3-axis gyroscope), an estimate of the distance traveled over a short period of time may be calculated.
- the transmitter associated with the beacon could tell the pan-tilt unit that it moved a determined number of meters in the last second but not the direction of this movement.
- the microcontroller would then compare the new location estimate (based, for example, on LOS reading and distance measurement) to the distance estimate and the previous location and make sure that the location reading is reasonable.
- a 9-axis IMU (3-axis accelerometer, 3-axis gyroscope, 3-axis magnetometer) is used.
- with a 9-axis IMU, an estimate for both distance and direction over a short period of time can be calculated.
- a data fusion algorithm combines the two pieces of information; for example, if the COT method detects movement of one meter and the IMU data predicts the movement was two meters over the same timeframe, the two values are averaged and the pointing device is instructed to point at the averaged location.
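The averaging example above can be written as a trivial weighted fusion; with equal weights it reproduces the one-meter/two-meter case. The weighting parameter is an illustrative assumption:

```python
def fuse_displacement(cot_estimate_m, imu_estimate_m, w_cot=0.5):
    """Blend the displacement implied by the COT (line-of-sight)
    reading with the IMU dead-reckoning estimate over the same
    timeframe. With equal weights this reproduces the simple
    averaging example: 1 m (COT) and 2 m (IMU) fuse to 1.5 m."""
    return w_cot * cot_estimate_m + (1.0 - w_cot) * imu_estimate_m
```

A fuller implementation could make `w_cot` depend on signal quality, lowering the COT weight when multipath is suspected.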
- the microcontroller uses such fused information for improving the overall location determination (to either detect or filter out errors caused by multipath, or aid in extrapolation when LOS is not available due to signal blocking).
- beacon 60 preferably emits circularly polarized radiation and the patch antennas are preferably designed to detect such radiation. Radiation that does not reach the patch antennas directly, i.e., radiation that is reflected from nearby objects, such as walls, will have lost the correct polarization and will be detected less.
- the microcontroller of the automated cooperative object tracking system may be programmed to recognize movement patterns that are characteristic of certain activities. Such recognition may then be used to control recording parameters. For example, when an exciting moment is about to occur (such as when a surfer is about to catch a wave), it would be desirable to have the automatic recording device change to, for example, a higher resolution and faster frame rate to get better quality footage of the surfer and wave, and then revert to a lower resolution and/or slower frame rate after the exciting event. Location and velocity data can be used by the microcontroller to detect that an exciting event is about to happen.
- there are recognizable characteristics of IMU sensor data that can be used to detect that the surfer is likely about to catch a wave.
- the microcontroller preferably triggers the camera to record at a higher recording resolution and/or frame rate.
- One example of recognizable data would be aggressive paddling motions detected by an IMU located on the surfer's arm. Paddling into a wave is much more aggressive than other paddling scenarios and occurs only for a short period of time (about one to five seconds).
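A minimal sketch of such a burst detector over accelerometer magnitudes; the acceleration threshold and sampling rate are hypothetical values, not taken from the disclosure, and only the one-to-five-second duration window comes from the text above:

```python
def detect_paddling_burst(accel_mags, rate_hz=50, threshold=8.0,
                          min_s=1.0, max_s=5.0):
    """Detect an aggressive paddling burst from a stream of
    accelerometer magnitudes (m/s^2) sampled at rate_hz. Returns
    True if magnitudes stay above threshold for a contiguous run
    lasting between min_s and max_s seconds."""
    run = best = 0
    for a in accel_mags:
        run = run + 1 if a > threshold else 0  # length of current run
        best = max(best, run)                  # longest run seen so far
    duration = best / rate_hz
    return min_s <= duration <= max_s
```

On a True result, the microcontroller would trigger the higher resolution and frame rate described above.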
- the microcontroller of the pan and tilt mechanism implements an image stabilization method that dampens high frequency vibrations but allows low frequency motions. This can be done using either mechanical or digital methods, or using a combination of both.
- cooperative object tracking is combined with video recognition (VR).
- the system starts tracking using a COT method (i.e., the subject holds a tracking beacon and IR, RF, GPS, etc., are used to track the beacon).
- the VR software automatically learns to recognize the subject that is being tracked and the system starts to track based on a combination of VR and COT. After some time, the system only needs VR to track the subject.
- the subject spends a certain amount of time using the beacon and “teaching” the system to operate based on VR, then the use of the beacon is discontinued.
- the beacon may be used less often, requiring less battery power and permitting use of a smaller beacon.
- a further advantage of this method is that the subject can become free from having to carry the beacon.
- after the system “learns” to recognize the subject, the system may also “learn” to recognize other subjects of the same “class”.
- class signifies, for example, a group of people engaged in similar activity (e.g., riding bicycles, surfing, etc.), or wearing similar clothing (e.g., uniforms as on a soccer team).
- Systems using class tracking may include a multiplicity of tracking cameras in a network that would follow members of the class based on their location (e.g., being within the soccer field, in the water, etc.), or other criteria (e.g., velocity, distance from the camera, etc.).
- the network of tracking cameras uses cooperative object tracking to start, but eventually uses the network's improved pixel tracking to track objects or events that are not being tracked cooperatively by the system.
- FIG. 10 is a flow diagram of a method of orientation according to a preferred embodiment of the present invention. Regarding such method, it is assumed that the tracking system is oriented nearly at the object such that the radiation signal emitted by the beacon is registered by both antennas of each antenna pair mounted on orthogonal planes.
- a beacon associated with the object to be tracked periodically transmits a signal in step 410 .
- the signal preferably incorporates a unique identifier of the beacon.
- the signal is preferably a radio wave, but it may be electromagnetic radiation from a different part of the electromagnetic spectrum (e.g., infrared light) or a sound wave (e.g., ultrasound).
- the signal transmitted from the beacon is detected by appropriate antennas associated with a pointing device, such as a video camera, having a pan-tilt mechanism, in step 420 .
- the antennas comprise two orthogonally arranged Paired Broad Gain Antenna (PBGA) systems.
- the gain of each paired antenna is determined in step 430 .
- the gain difference is determined algebraically, i.e., both the sign (i.e., which antenna has the higher gain) and the magnitude of the gain difference are determined in step 440 .
- the system determines if the gain difference is nearly zero or not. If the gain difference is nearly zero, no action is taken (step 460 ). If the gain difference is not nearly zero, then the method proceeds to step 470 .
- step 470 the pan-tilt mechanism of the pointing device turns in the direction of the higher gain of each antenna pair. That is, the gain difference detected within the panning plane causes the mechanism to pan if that difference is not nearly zero and the gain difference detected in the tilt plane causes the mechanism to tilt if that difference is not nearly zero.
- the definition of “nearly zero” is equivalent to defining a deadband that is useful to avoid jittery reaction to minute changes in the detected antenna gains. Such minute changes may be due to minute changes in the position of the object tracked or may be due to system noise.
- turning velocity is controlled in part based on the magnitude of the gain difference detected between each pair of antennas. This is based on the recognition that a higher difference in gain indicates that the object's direction is farther from the current orientation of the pointer.
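The deadband and proportional-velocity behavior described above can be sketched as follows; the deadband width, proportional gain, and speed limit are illustrative assumptions:

```python
def pan_command(gain_left_db, gain_right_db, deadband_db=1.0, k=2.0,
                max_dps=90.0):
    """Convert an algebraic antenna-gain difference into a signed pan
    velocity (degrees/s). Inside the deadband no command is issued,
    avoiding jittery reaction to minute gain changes; outside it, the
    speed grows with the magnitude of the difference."""
    diff = gain_left_db - gain_right_db
    if abs(diff) <= deadband_db:
        return 0.0  # "nearly zero": take no action
    speed = min(k * abs(diff), max_dps)
    # Turn in the direction of the antenna with the higher gain.
    return speed if diff > 0 else -speed
```

An identical routine driven by the second antenna pair would produce the tilt command.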
- a beacon associated with a person may move in such a way that the radiation intensity emitted in the direction of the receiving antennas may change rapidly. To avoid errors associated with such changes between the measurements of the gains of the antenna pair, one may want to do measurements in very quick succession, for example at a rate of 100 Hz.
- orientation commands to the pan-tilt mechanism that orient the pointing device/camera need to be provided only at a rate of about 5 Hz, for example. That would allow for an averaging of 10 gain difference readings before each turning command and a corresponding improvement of signal-to-noise ratio.
- Kalman filtering may also be employed.
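The oversampling-and-averaging scheme above (roughly 100 Hz gain-difference samples feeding roughly 5 Hz turning commands) can be sketched as a simple moving-average buffer; a Kalman filter could replace the plain average. Class and method names are hypothetical:

```python
from collections import deque

class GainDifferenceFilter:
    """Average gain-difference samples taken at ~100 Hz so that a
    smoothed value is available for each ~5 Hz turning command,
    improving signal-to-noise roughly by sqrt(10)."""

    def __init__(self, samples_per_command=10):
        # deque with maxlen keeps only the most recent samples
        self.buf = deque(maxlen=samples_per_command)

    def add_sample(self, diff_db):
        self.buf.append(diff_db)

    def command_value(self):
        if not self.buf:
            return 0.0
        return sum(self.buf) / len(self.buf)
```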
- the tracking apparatus is a part of a system that comprises other similar apparatuses (automated cameramen) and also a human cameraman, or more generally, a human user in control of the system.
- the human user may or may not be a part of the action that is being filmed.
- the human user can determine when the action is worth recording or photographing.
- a problem with automated cameramen is that they do not have the same intelligence as a human to determine when to start or stop filming. For example, automated cameramen will not inherently respond to a director yelling “action”.
- the one or more automated cameramen in the present embodiment start or stop recording (or take photos) based on the remote control of a human.
- the remote control is activated when the human user is a cameraman and uses their camera such that the automated cameramen mimic what the human cameraman is doing. For example, when the human cameraman begins recording, a signal is sent to all the automated cameramen, which also start recording.
- each tracking device utilizes data from all devices in a machine learning protocol to learn from each other so that tracking uses location and biometric data compared with pixel tracking and all the other types of tracking to further optimize the tracking and video recording process.
- a network of tracking devices will act as an array of eyes and ears that automatically senses and alerts to problems. Examples are: (a) a camera “sees” in its peripheral vision that a car accident occurred and automatically dials 911; (b) a camera that is part of a municipal network “sees” a police officer in trouble and alerts the police department; (c) a camera is following a surfer but senses a drowning victim in the background, or even recognizes a whale breaching, and alerts the network.
- the tracking apparatus is used to recognize events.
- a sporting event can be determined based on the types of motions detected (e.g., youth soccer).
- the inventive system of tracking devices may also preferably connect with a cellular company (such as Verizon) to detect their customers in the vicinity of an event based on user's cell phone location data.
- the company can reasonably assume that those customers are also interested in that type of event (e.g., youth soccer). Then anyone with a Verizon phone in the vicinity may be flagged as being interested in youth soccer and targeted for related advertising.
- video recognition algorithms may also be used for collecting additional customer data.
- FIG. 11 is a flow diagram of an automated editing and publishing method of the footage recorded by the inventive system.
- a problem of uploading video footage to a remotely located editing service is that video footage files are large and time consuming to transfer.
- An inventive way to solve this problem is illustrated using FIG. 11 .
- the user films, using the inventive system and method hereof, footage in high resolution in step 500 .
- the footage is saved on the user's device in step 510 .
- the user's device creates a low resolution version of the footage in step 520 , and uploads it to a remotely located server (computer) of an editing service in step 530 .
- the editing service then edits the low resolution footage into “appealing video clips” in step 540 .
- the editing software that the editing service uses then creates a set of specific editing instructions that are to be implemented on the high resolution version on the user's device in step 550 .
- the instructions are sent to the user and are implemented on the user's device, which has editing software capable of following the editing instructions generated in step 550 and replicating the appealing video clip(s) using the high resolution video files (step 560).
- the result may be reviewed by the user in step 570 .
- the user decides whether the edit is “good” or not in step 580 . If not, the editing process may be repeated by returning to step 540 .
- the user may provide editing suggestions, instructions, and the like to improve the second round of editing.
- the user uses the high resolution edited footage in step 590 . For example, the user may upload the edited clip to a server for viewing by others.
- the remotely located video editing service may be an automated service or a service performed by humans.
- Step 580 is optional as noted by the dashed line connecting step 570 with step 590 .
- the automated editing service collects feedback data to improve its editing capabilities. As more people use the service and provide feedback, the quality of the editing service improves. The feedback could be based on users modifying the edited video clips with their own computers' software. Data on the modifications the user made is sent as feedback to the editing service to improve its editing algorithms.
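One way to picture the round trip of steps 540–560 is as an edit decision list (EDL) built on the low resolution proxy and replayed against the high resolution files on the user's device. The field names and the proxy-to-high-res lookup below are hypothetical illustrations, not part of the disclosure:

```python
def make_edit_instructions(clips):
    """Turn editing decisions made on the low resolution proxy into a
    resolution-independent edit decision list (EDL): each entry keeps
    the source file name and in/out timecodes in seconds."""
    return [{"src": c["src"], "in": c["start_s"], "out": c["end_s"]}
            for c in clips]

def apply_edit_instructions(edl, resolve_high_res):
    """Replicate the proxy edit on the user's device: map each proxy
    file name to its high resolution counterpart and re-cut the same
    time ranges. resolve_high_res is a user-supplied lookup."""
    return [{"src": resolve_high_res(e["src"]), "in": e["in"], "out": e["out"]}
            for e in edl]
```

The point of the scheme is that only the small EDL crosses the network, never the large high resolution footage.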
Abstract
A system and method are provided for automatic cooperative object tracking using gain comparison of antenna pairs facing different directions. In cooperative object tracking, the object is associated with a radiation source, or beacon, that emits radiation that is detected by the tracking system. The present invention makes use of antennas that are not highly oriented antennas but are characterized by having a steep drop in their gain profiles at a particular angle of incidence of the radiation that they detect.
Description
- This application is a continuation of U.S. Nonprovisional patent application Ser. No. 16/357,115, filed Mar. 18, 2019, titled “COOPERATIVE AUTOMATIC TRACKING”, which is a continuation of U.S. Nonprovisional patent application Ser. No. 14/630,524, filed Feb. 24, 2015, titled “COOPERATIVE AUTOMATIC TRACKING”, and claims the benefit of U.S. Provisional Patent Application No. 61/943,903, filed Feb. 24, 2014, titled “COOPERATIVE AUTOMATIC TRACKING”, the contents of which are hereby incorporated by reference in their entirety and are not admitted to be prior art with respect to the present invention by their mention in this cross-reference section.
FIG. 1 shows an illustration of a traditional antenna based tracking system featuring three antennas. -
FIG. 2 is a schematic drawing explaining the operation of a traditional radar based tracking system with two directional antennas. -
FIG. 3 is a graph showing patch antenna gain versus signal's angle of incidence. -
FIG. 4 is schematic representation of an antenna arrangement of an automated cooperative object tracking system in relation to a beacon according to a preferred embodiment of the present invention. -
FIG. 5 is a schematic depiction of the angular distribution of gains of an antenna pair according to a first preferred embodiment of the present invention. -
FIG. 6 is a schematic depiction of the angular distribution of gains of three antennas according to another preferred embodiment of the present invention -
FIG. 7 is a graph depicting the change of the angular location of a beacon orbiting a pointing device as detected according to the inventive method hereof and as deduced from GPS location detection data. -
FIG. 8 is a perspective view of a pan and tilt tracking unit of an automated cooperative object tracking system having two pairs of directional solid state antennas according to a preferred embodiment of the present invention. -
FIG. 9 is a schematic drawing of a wiring diagram of antennas and other components of an automated cooperative object tracking system according to a preferred embodiment of the present invention. -
FIG. 10 is a flow diagram of a method of orientation according to a preferred embodiment of the present invention. -
FIG. 11 is a flow diagram of an automated editing and publishing method of the footage recorded by the inventive system. - The present invention is related to the field of orienting a pointing device at a beacon. The present invention is also related to the field of automatic cooperative object tracking (COT). The present invention is also related to the field of automatic video recording using line of sight (LOS) technology. The present invention is also related to monopulse amplitude comparison based radar tracking.
- In a system for cooperative tracking of an object, a pointer (also referred to as a pointing device) is associated with a pan-tilt mechanism. Two (or more) antennas are associated with the pointer. A beacon is associated with an object to be tracked. The beacon emits an identifiable signal detectable by the antennas. A microcontroller compares the gains of the two antennas and causes the pan-tilt mechanism to turn in the direction of the antenna with the higher gain. In the inventive system, pairs of small solid state antennas are oriented in substantially opposite directions.
FIG. 1 shows an illustration of a traditional antenna based tracking system featuring three antennas. The direction of an incoming wave signal can be determined using directional antennas. The traditional Monopulse Amplitude Comparison (MAC) method is illustrated by FIG. 1 and FIG. 2. The application of this method requires the use of large, highly directional antennas, such as those depicted in FIG. 1. The antenna system depicted in FIG. 1 is very large and is only transportable with the use of a tractor-trailer due to its size. An example of the antenna system depicted in FIG. 1 is described in the following publication: C. T. Nadovich, J. F. Aubin, D. R. Frey, An Instrumentation Radar System for Use in Dynamic Signature Measurements (Jun. 1, 1992) (available at <http://www.microwavevision.com/sites/www.microwavevision.com/files/files/ORBIT-FR-InstrumentationRadarSystem-92-06-02-Nadovich_0.pdf>). -
FIG. 2 is a schematic drawing explaining the operation of a traditional radar based tracking system with two directional antennas. FIG. 2 shows how two radar antennas are oriented such that the angle between their maximum gain vectors is only a few degrees. Gain values of a received signal are compared between the two antennas of FIG. 2 and the angle θ of the incoming signal from a source denoted as “target” can be calculated as the deviation from the orientation of the system of the antennas. By definition, this orientation is the line of symmetry between the two gain vector distributions (Beam 1 and Beam 2), which is denoted as “Crossover axis” in FIG. 2. Thus, the angle θ is measured from the crossover axis. For complete direction determination at least three antennas are needed; these may be thought of as being grouped into two couples of antennas with orientation directions in two different (intersecting) planes. Highly directional antennas are too large to be used in small consumer electronic applications. An important feature of the MAC method is that the antennas are oriented in just slightly different directions, θs in FIG. 2. Here the size of the angle θs is tied to the directionality (acceptance angle) of the antennas such that there is some overlap between the gain curves Beam 1 and Beam 2 of the antennas. - The present invention is an implementation of line of sight (“LOS”) technology for cooperative object tracking (“COT”). One important application of the cooperative object tracking described herein is with automated video recording of freely moving subjects. In such an application, a subject is equipped with a “remote device” that may be carried or worn that is also a radiation transmitter. The “remote device” may also be referred to herein as a beacon or target.
To track the subject, an automated cooperative object tracking system preferably comprises one or more receiver devices that receive the transmission of the beacon; the information contained therein is used to orient a pointing device, such as a camera, at the beacon and, by implication, at the subject. As will be described further herein, a multiplicity of beacons and/or a multiplicity of pointing devices may be used within the automated cooperative object tracking system.
- One of the disadvantages of using highly directional antennas for object tracking is that if the object is significantly away from the direction of the antenna, the object is difficult to locate. In
FIG. 2, the target is located at a direction where the gain of at least one of the antennas is useful to detect it. Otherwise, the system shown in FIG. 2 would not be able to locate the target easily, if at all. As will be described below, the present invention is advantageous in that the object is never “lost” because the antennas employed are not narrowly oriented as they are in FIG. 2. - According to a preferred embodiment hereof, the inventive system uses two small (centimeter sized) solid state antennas facing very different directions (in some cases close to 180 degrees (i.e., close to opposite directions), while in other cases the angle may be as small as 50 degrees) to determine which of the pair receives the transmitted signal more strongly. An example of an antenna that may be used is a patch antenna. A patch antenna is a type of radio antenna with a low profile, mountable on a flat surface. Generally, patch antennas have a flat, rectangular sheet or “patch” of metal mounted over a larger sheet of metal called a ground plane. A typical patch antenna gain pattern is shown in
FIG. 3. FIG. 3 illustrates patch antenna gain vs. the signal's angle of incidence. In FIG. 3, the antenna is in the center and it is oriented at θ=90 degrees. The over 60 degree wide (3 dB drop-off) maximum gain width is too broad for traditional MAC techniques. The inventive solution of the present invention only requires a steep gain slope, which is characteristic of small patch antennas at about 90 degrees from the surface normal. In FIG. 3 the steep gain slopes are at about +200 degrees and at about +340 degrees (which also may be thought of as −20 degrees). Also, patch antennas may be specially designed to have a sharp drop-off in gain at other particular angles. The antennas constitute a pair (in other words, the antennas are paired) when the two antennas are directed in substantially different directions such that in the direction of the half angle between the antennas' orientations the angular gradient of the gains of the antennas is maximum. Because of the typical cylindrical symmetry of patch antennas, when the antennas are paired, there is naturally no gain overlap except within and near to the common plane of the axes (common central plane) of the paired antennas. This is important in those instances when the beacon is not within this plane or close to it. If this condition does not hold for a particular make and/or model of antennas, one can add additional features to their mounting to ensure that the condition of no gain overlap substantially outside of the common central plane is fulfilled. - The invention described herein uses as an example radio frequency (RF) radiation being employed by the beacon. While this choice implies the use of particular equipment (RF antennas, etc.), it is not implied that using another type of radiation is not within the scope of the present invention. For example, infrared (IR) and ultrasonic radiation sources and detectors are available and it is a matter of technological detail and choice as to which type of radiation is best to employ.
The wavelength choice within the RF band is of some significance; free availability of off the shelf equipment that does not require additional certification may be balanced by the desire of choosing wavelengths at which reflection effects (multipath errors) are minimal.
- Additional mounting and shielding techniques may be used to cause the gain of the antenna to drop sharply at a particular angle. For example, mechanical barriers may be used such as extending the printed circuit board (“PCB”) on which a patch antenna is mounted to block signals past a desired angle (e.g., 90 degrees). In this arrangement, the directionality of the antenna is not important. That is to say, it is not important in the same way as in the case of the example shown in
FIG. 2 where a narrow maximum gain is necessary in the direction of the antenna. Here, on the contrary, the sharp drop-off of the gain along a particular direction is the requirement. The important feature is that the antenna's gain drops sharply as the signal's angle of incidence passes a certain angle. -
FIG. 4 is a schematic representation of an antenna arrangement of an automated cooperative object tracking system in relation to a beacon according to a preferred embodiment of the present invention. FIG. 4 shows a top view of a preferred embodiment of automated cooperative object tracking system 100. In the preferred embodiment of FIG. 4, automated cooperative object tracking system 100 is used to orient camera 25 at a beacon 60. Beacon 60 is also a radiation transmitter. Automated cooperative tracking system 100 comprises a pair of patch antennas 20 and 30 mounted on panning unit 10 that can turn about an axis A (axis A is perpendicular to the plane of the drawing). The antennas are mounted on PCBs. Antenna 20 receives the signal from beacon 60 at an angle θ1 that is less than the drop-off angle, while antenna 30 receives the signal at an angle θ2 that exceeds the drop-off angle. As a result, antenna 20 receives a stronger signal than antenna 30, and a processing unit/microcontroller (not shown) compares the gain levels of the signals received by each antenna using, for example, the received signal strength indicator (RSSI) method. Depending on which antenna had the higher gain, the microcontroller will direct a turning mechanism (not shown) to turn panning unit 10 about axis A and rotate panning unit 10 in an attempt to keep the gain levels of the two antennas the same, i.e., in a direction that minimizes the difference between the signal intensities detected by antenna 20 and antenna 30 (in the counterclockwise direction in the case of the example of FIG. 4). -
FIG. 5 is a schematic depiction of the angular distribution of gains of an antenna pair according to a preferred embodiment of the present invention. FIG. 5 further illustrates the operation of the automated cooperative tracking system of FIG. 4. The orientation of the antennas is tied to the orientation of the camera 25. For clarity, the antennas, PCBs, etc., are not shown in FIG. 5. The orientation of the optical axis of camera 25 is defined as 0 degrees. When the incoming radiation reaches the antennas of the automated cooperative tracking system at an angle that is less than 0 degrees, the left side antenna will have a gain (Wleft) that is greater than the gain of the right side antenna (Wright), and the panning unit with camera 25 attached turns to the left. In other words, the panning unit turns in the direction of the antenna with the higher gain. Note that the turning is described here with the antennas being fixed on the panning mechanism. Thus, to turn the camera toward the beacon, the antenna with the higher gain turns away from the beacon and the antenna with the lower gain turns toward the beacon. The turning stops when the gains on both antennas are equal. Thus, the relationship between gain and the turning of the camera can be generally stated as follows: if the left antenna gain is greater than the right antenna gain, then the camera is rotated to the left; if the right antenna gain is greater than the left antenna gain, then the camera is rotated to the right. - The two antennas may have slightly different gain patterns. In such a situation, a calibration procedure solves this issue. The calibration may be done by placing a transmitter directly in front of the panning unit (at 0 degrees, by definition) and taking measurements of the received signal strength of each antenna. In an ideal situation, the two antennas would receive the transmissions with equal strength when the transmitter is directly in front of the panning unit.
Practically, however, the strength of the received signal may be different between the two antennas when the transmitter is directly in front of the panning unit. Using the calibration values accounts for the differences in gain patterns.
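The calibration described above can be sketched as storing a boresight offset that is later subtracted from raw gain differences; the function names and dB units are illustrative assumptions:

```python
def calibrate_offset(left_samples_db, right_samples_db):
    """With the transmitter placed directly in front of the panning
    unit (0 degrees, by definition), record received signal strength
    from both antennas and compute the per-unit offset that makes the
    pair read 'equal' at boresight."""
    avg_l = sum(left_samples_db) / len(left_samples_db)
    avg_r = sum(right_samples_db) / len(right_samples_db)
    return avg_l - avg_r

def corrected_difference(gain_left_db, gain_right_db, offset_db):
    """Apply the stored calibration offset to a raw gain difference."""
    return (gain_left_db - gain_right_db) - offset_db
```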
- The antenna arrangement illustrated in
FIG. 4 and FIG. 5 has the following deficiency: if the beacon is in the space hemisphere that is behind the camera (for example, close to 180 degrees) and there is a similar crossover of the antenna gains at another angle, for example at 180 degrees, then the panning device will turn the camera to this direction, which is not (and may even be the opposite of) the beacon direction. This is not a problem if the system is always used for following targets that are in the front hemisphere with respect to the camera. In a preferred embodiment of the present invention, this problem is resolved by using three or more antennas in the same plane. This arrangement is illustrated by FIG. 6. -
FIG. 6 is a schematic depiction of the angular distribution of gains of three antennas according to another preferred embodiment of the present invention. In FIG. 6 the three antennas are shown schematically having similar gain patterns that provide relatively high gains within a window of about 150 degrees. The gain windows have sharp edges where the gain falls off. Antenna 1 and Antenna 2 are mounted such that their overlapping edges are at angle 0 degrees, which is the direction of the camera. Antenna 3 is mounted such that it has relatively wide overlaps with both Antenna 1 and Antenna 2. - To explain the operation of this antenna arrangement of
FIG. 6, it is useful to consider three beacon positions. Regarding Beacon 1, the orientation process is no different from that discussed with respect to FIG. 5 above. Since the signal on Antenna 1 is stronger than on Antenna 2, a motor associated with the tracking unit will turn the tracking unit to the left and this process will continue until the signals on Antenna 1 and Antenna 2 are equal. - Regarding
Beacon 2, both Antenna 2 and Antenna 3 register signal from Beacon 2. The tracking unit will turn to the right and this process will continue until Antenna 1 begins to register Beacon 2 and further until the signals on Antenna 1 and Antenna 2 are equal. - Regarding
Beacon 3, only Antenna 3 registers signal from Beacon 3. The tracking unit will begin to turn in a preprogrammed direction (either to the right or to the left). Eventually, Beacon 3 will register on Antenna 1 (if the unit is turning to the left) or on Antenna 2 (if the unit is turning to the right). The turning will continue until Beacon 3 registers on both Antenna 1 and Antenna 2 and further until the signals on Antenna 1 and Antenna 2 are equal. - In a slight modification, the tracking unit of the automated cooperative tracking system may be programmed to sense and register whether the gain of
Antenna 3 increases or decreases after the tracking unit first starts turning. If the gain increases, the tracking unit reverses its turning direction, but if the gain decreases, the tracking unit will keep turning in the initial direction. This modification may decrease the time that elapses between first registering Beacon 3 and finally having the camera oriented at this beacon. - It is important to realize that the antenna arrangement of the automated cooperative tracking system of the present invention is not limited to three antennas. One may use four or more antennas that may have narrower gain windows, the advantage being that such antennas may have higher gains. If there are N antennas numbered from 1 to N from left to right and a beacon registers on antenna M, then, if M<½N, the tracking unit will turn to the right, but if M>½N, it will turn to the left. In either case the tracking unit will keep turning until both
Antenna 1 and Antenna N register signal from the beacon and then until the signal registered is equal on both antennas. In the remainder of this disclosure we will refer to such antenna systems as Paired Broad Gain Antenna systems, or PBGA systems, irrespective of the number (two or more) of antennas that are actually in the system. It should be noted that to guide a tracking system in both pan and tilt directions, two PBGA systems must be employed, preferably arranged orthogonally (although an orthogonal arrangement is not necessary). Further, recognizing that in a single system Antenna 1 and Antenna N constitute the antennas that make it a paired antenna system, we will use the term “paired antennas” or “antenna pairs” to describe these two antennas within a PBGA system. -
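The N-antenna turning rule described above can be sketched as follows. This is a minimal illustration; the function name, the deadband value, and the tie-breaking default for M = ½N are assumptions and are not taken from the disclosure:

```python
def pbga_turn_command(registering, gains, n, deadband=0.5):
    """Turning rule for an N-antenna PBGA system (illustrative sketch).

    `registering` is the set of antenna indices (1..n, numbered left to
    right) that currently detect the beacon; `gains` maps an index to its
    measured gain.  Implements the rule quoted above: a beacon seen on
    antenna M turns the unit to the right when M < n/2 and to the left
    when M > n/2; once the paired end antennas (1 and n) both register,
    turning continues until their gains are equal within a deadband.
    """
    if 1 in registering and n in registering:
        diff = gains[1] - gains[n]
        if abs(diff) < deadband:
            return "hold"                 # paired gains equal: on target
        return "left" if diff > 0 else "right"
    m = min(registering)                  # a representative registering antenna
    if m < n / 2:
        return "right"
    if m > n / 2:
        return "left"
    return "right"                        # M == n/2: preprogrammed default
```

For example, with four antennas, a beacon registering only on antenna 4 yields a "left" command, and the command becomes "hold" once antennas 1 and 4 report equal gains.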
FIG. 7 is a graph depicting the change of the angular location of a beacon orbiting a pointing device as detected according to the inventive method hereof and as deduced from GPS location detection data. The graph of FIG. 7 shows experimental data regarded as proof of concept for the automated cooperative object tracking system and method of the present invention. To obtain the data shown in FIG. 7, a beacon was equipped with both a radio transmitter and a GPS antenna. A pair of off-the-shelf TP-Link® TL-ANT2409A antennas were used as the receivers, in keeping with the arrangement illustrated in FIG. 4. The antennas were stationary and the beacon was moved around to position it at different angles with respect to the antennas. The angular positions of the beacon were recorded using the GPS signal and are shown as black squares. The apparent angular positions of the beacon were also measured using the inventive method and are shown as empty circles in FIG. 7. Considering that no measures were taken to filter out electronic and other noise, or to optimize the apparatus in any respect, the data show an acceptable degree of agreement between the angular positions deduced from GPS locating and those obtained using the inventive apparatus and method. - To orient a pointing device at a source of radiation in a three-dimensional space, one has to turn the pointing device about two axes that may be referred to as the pan and tilt axes. The example illustrated in
FIG. 4 may be regarded as either the pan portion or the tilt portion of such a pan and tilt apparatus. In one embodiment the tilt portion of the apparatus works essentially on the same principles as described in conjunction with FIG. 4 and FIG. 5. However, there are a few more details useful to consider. To accomplish the task of orientation, one may use an apparatus comprising a panning unit mounted on a base, such as, for example, a tripod, and a tilting unit mounted on the panning unit. If the antennas for panning are mounted on the sides of the panning unit as described and shown above in FIG. 4, then the incoming signal from a moving radiation source that is at times at a higher elevation and at other times at a lower elevation arrives at the antennas at varying angles. In such a situation, the characteristics of the antennas may not be the same in all these directions. This may defeat the calibration described above and may lead to inaccuracies in pointing. To avoid this, it is preferable to mount both the pan and the tilt antennas on the tilt portion of the orienting apparatus. -
FIG. 8 is a perspective view of a pan-tilt mechanism of an automated cooperative object tracking system having two pairs of directional solid state antennas according to a preferred embodiment of the present invention. FIG. 8 illustrates one possible mounting arrangement of two patch antenna pairs on a pan-tilt mechanism 200. Panning mechanism 210 may be mounted on a tripod or other base via a rotatable shaft that permits the rotation of panning mechanism 210 around axis A, shown as a dashed-dotted line. Panning mechanism 210 is operationally coupled to tilting mechanism 220. Tilting mechanism 220 is capable of rotation about axis B, also shown as a dashed-dotted line. A portion of tilting mechanism 220 is shaft 225, which serves as the mounting base for patch antenna 230, situated on the left end of shaft 225. Another patch antenna is similarly mounted on the right end of shaft 225 but is not shown in the drawing. Bases hold patch antennas 245 and 255 on pan-tilt mechanism 200. In keeping with the inventive method hereof and described with the aid of FIG. 4, one of the patch antennas is mounted facing up (antenna 255) and the other is mounted facing down (antenna 245, shown using dashed lines). As noted above, more antennas may be used. Pan-tilt mechanism 200 is also equipped with omnidirectional antenna 260 used to receive and to send radio signals. Screw 270 is used to mount a camera or other pointing device on pan-tilt mechanism 200. - The RF tracking method of the present invention may have a high sampling rate but noisy output. Therefore, filtering algorithms are preferably employed to smooth the tracking data.
- The signal emitted by the beacon (e.g.,
beacon 60 of FIG. 4) preferably carries a code that makes it distinguishable from any otherwise similar background radiation. -
FIG. 9 is a schematic drawing of a wiring diagram of antennas and other components of an automated cooperative object tracking system according to a preferred embodiment of the present invention. In the preferred embodiment illustrated by FIG. 9, all patch antennas are connected to a single amplifier and gain measurements are preferably carried out sequentially. In FIG. 9, automated cooperative object tracking system 300 comprises four patch antennas 315, 325, 335, and 345, and non-directional antenna 350. All patch antennas are shown together with PCBs and microcontroller 370. Microcontroller 370 may preferably be of the type known as “RX MCU”. As an example, the gains may be measured in the following order: left-facing antenna (antenna 345), right-facing antenna (antenna 335), up-facing antenna (antenna 325), and down-facing antenna (antenna 315, shown in dashed lines). This method is useful to minimize equipment cost, but it assumes that measurements may be carried out sufficiently fast. The system architecture shown and described in FIG. 9 may be modified such that only some antennas are grouped and queried in sequence. For example, each PBGA system of a pan-tilt tracker may have its own grouping of antennas in which the antennas are read sequentially. - Generally, if the beacon moves with a velocity v in a direction perpendicular to the pointing vector R, wherein vector R points from the location of the pointing device to the beacon, then the angular velocity of the pointer must be ω = v/R, where R is the length of vector R. If the beacon emits signals in time intervals τ, then between two orientation readings the pointer must turn an angle ωτ. (The movement in the direction perpendicular to the pointing vector R is the worst case scenario; however, this calculation neglects the time required to measure the gains and to translate such measurements into turning commands.) Using the sequential gain measurement embodiment increases the effective time between readings by a factor of 4, to 4τ (or, if
antenna 350 requires equal time, 5τ). Whether this is acceptable for a particular purpose depends on the expected velocities of the beacons and on the other technical characteristics of the system. - It is noted that techniques exist (e.g., RF ranging and IR intensity measurement) that can yield information concerning the distance between the pointer and the beacon, i.e., the length R. Thus, R is assumed to be known. It is also noted that the combined knowledge of the direction between the pointer and the beacon and of the distance between them is sufficient to know their relative positions (relative locations). In a preferred embodiment of the present invention, cameras that follow or track a beacon are preferably automatically turned on and off based on the beacon's proximity to the camera, or to a particular camera if more than one camera is used.
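The worst-case angular-rate arithmetic above (ω = v/R, a turn of ωτ between readings, stretched to kτ when k antennas are read sequentially through one amplifier) can be checked with a short sketch; the function name and all numbers are illustrative:

```python
def required_turn_per_reading(v, R, tau, n_sequential=1):
    """Worst-case turn angle (radians) between beacon readings.

    A beacon moving at speed v (m/s) perpendicular to the pointing vector
    of length R (m) demands an angular velocity omega = v / R (rad/s).
    With beacon signals every tau seconds, the pointer must turn
    omega * tau between readings; reading n_sequential antennas in turn
    stretches the effective interval to n_sequential * tau.
    """
    omega = v / R
    return omega * n_sequential * tau

# Illustrative numbers: 5 m/s at 50 m range, 50 ms signal interval,
# four antennas read sequentially through a single amplifier.
angle = required_turn_per_reading(5.0, 50.0, 0.05, n_sequential=4)  # 0.02 rad
```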
- In a preferred embodiment of the present invention, the beacon is preferably tracked using multiple pointing devices and each pointing device itself may also be used as a beacon. When the distance between any two of the pointers is known, it is possible to determine the locations of all other pointers. This may be done during a setup procedure before actual tracking and filming starts using the systems and methods disclosed herein. Setup of such a system comprises determining angles between the directions of any second and third pointers with respect to a first pointer and then using geometrical calculations (triangulation). Once the locations of all pointers are known, the location of the beacon may be determined using orientation data from any two of the pointers. Thus, if multiple pointers can track the same beacon, the location of the beacon may be determined using multiple sets of independent data. This opens the possibility of using such data to determine the beacon's location with high certainty and to eliminate multipath effects. Eliminating multipath effects is particularly important when such a system is used indoors where walls can reflect radio waves.
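The triangulation step described above, locating the beacon from the orientation data of any two pointers, can be sketched in two dimensions as follows (the function name and coordinate convention are illustrative; a production system would also reconcile redundant pointer pairs to reject multipath):

```python
import math

def triangulate(p1, bearing1, p2, bearing2):
    """Locate a beacon from two pointers' positions and bearings.

    p1, p2: (x, y) pointer locations; bearing1, bearing2: absolute
    bearings (radians from the +x axis) at which each pointer sees the
    beacon.  Returns the intersection of the two rays, solving
    p1 + t1*d1 = p2 + t2*d2 via the 2-D cross product.
    """
    d1 = (math.cos(bearing1), math.sin(bearing1))
    d2 = (math.cos(bearing2), math.sin(bearing2))
    denom = d1[0] * d2[1] - d1[1] * d2[0]
    if abs(denom) < 1e-12:
        raise ValueError("rays are parallel; beacon location undetermined")
    dx, dy = p2[0] - p1[0], p2[1] - p1[1]
    t1 = (dx * d2[1] - dy * d2[0]) / denom
    return (p1[0] + t1 * d1[0], p1[1] + t1 * d1[1])
```

For instance, two pointers 10 meters apart that see the beacon at 45° and 135° place it 5 meters in front of the midpoint between them.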
- In another preferred embodiment of the present invention,
beacon 60 is also equipped with one or more inertial measurement units (IMUs). Locating/orienting techniques are prone to errors (e.g., multipath errors) and blocked signals. These location errors can be reduced using one or more IMUs on the beacon. The IMU measures the beacon's accelerations, and the acceleration data can be transmitted to the orienting pan and tilt unit to be used by the microcontroller to supplement the measurements of the position of the beacon. For example, a 3-axis accelerometer placed on the transmitter can send information relating to the magnitudes of accelerations experienced by the beacon (also called “motion indication values” [“MIVs”]). The filter settings on the pan and tilt unit may be adjusted in real time based on the MIVs received such that if an instantaneous location determination is far from the previous location indications, and the MIV is reporting low values of acceleration, it is likely the new signal is the result of a multipath reflection. Such a signal should either be ignored or heavily filtered. In another example, using a 6-axis IMU (meaning a 3-axis accelerometer and 3-axis gyroscope), an estimate for distance traveled over a short period of time may be calculated. However, the direction the beacon moved would not be known. The transmitter associated with the beacon could tell the pan-tilt unit that it moved a determined number of meters in the last second but not the direction of this movement. The microcontroller would then compare the new location estimate (based, for example, on LOS reading and distance measurement) to the distance estimate and the previous location and make sure that the location reading is reasonable. - In another preferred embodiment of the invention a 9-axis IMU (3-axis accelerometer, 3-axis gyroscope, 3-axis magnetometer) is used. With a 9-axis IMU, an estimate for both distance and direction over a short period of time can be calculated. 
A data fusion algorithm combines the two pieces of information; for example, if the COT method detects movement of one meter and the IMU data indicates the movement was two meters over the same timeframe, the two values are averaged and the pointing device is instructed to point at the averaged value. The microcontroller uses such fused information to improve the overall location determination (to either detect or filter out errors caused by multipath, or aid in extrapolation when LOS is not available due to signal blocking).
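A minimal sketch of such a fusion and multipath-gating step might look like the following; the thresholds, the 50/50 averaging weight, and the function signature are assumptions for illustration:

```python
import math

def fuse_location(prev_loc, cot_loc, imu_distance, miv, gate=2.0):
    """Blend a new COT location fix with IMU odometry (illustrative).

    prev_loc, cot_loc: (x, y) positions in meters; imu_distance: the
    distance the beacon's IMU reports traveling since prev_loc; miv:
    motion indication value (low when the beacon is nearly still).  A COT
    fix implying a jump far beyond the IMU-reported travel while the MIV
    is low is treated as a likely multipath reflection and ignored;
    otherwise the COT and IMU travel distances are averaged, as in the
    9-axis example above.  All thresholds are assumptions.
    """
    dx, dy = cot_loc[0] - prev_loc[0], cot_loc[1] - prev_loc[1]
    cot_distance = math.hypot(dx, dy)
    if miv < 0.1 and cot_distance > gate * max(imu_distance, 0.1):
        return prev_loc                       # reject suspected multipath fix
    if cot_distance < 1e-9:
        return cot_loc                        # no movement detected by COT
    blended = 0.5 * (cot_distance + imu_distance)
    scale = blended / cot_distance            # move along the COT direction
    return (prev_loc[0] + dx * scale, prev_loc[1] + dy * scale)
```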
- In another preferred embodiment of the present invention,
beacon 60 preferably emits circularly polarized radiation and the patch antennas are preferably designed to detect such radiation. Radiation that does not reach the patch antennas directly, i.e., radiation that is reflected from nearby objects, such as walls, will have lost the correct polarization and will be detected less. - In another preferred embodiment of the present invention, when the system is employed for automated video recording, the microcontroller of the automated cooperative object tracking system may be programmed to recognize movement patterns that are characteristic of certain activities. Such recognition may then be used to control recording parameters. For example, when an exciting moment is about to occur (such as when a surfer is about to catch a wave), it would be desirable to have the automatic recording device change to, for example, a higher resolution and faster frame rate to get better quality footage of the surfer and wave, and then revert back to a lower resolution and/or slower frame rate after the exciting event. The microcontroller may be programmed to detect when an exciting event is about to happen using location and velocity data. For example, there are recognizable characteristics of IMU sensor data that can be used to detect that the surfer is likely about to catch a wave. When such characteristics are detected, the microcontroller preferably triggers the camera to record at a higher recording resolution and/or frame rate. One example of recognizable data would be aggressive paddling motions detected by an IMU located on the surfer's arm. Paddling into a wave is much more aggressive than other paddling scenarios and occurs only for a short period of time (about one to five seconds).
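A paddling-burst detector of the kind described might be sketched as follows; the acceleration threshold and the one-to-five-second window are assumptions, and a real classifier would be tuned on recorded IMU data:

```python
def detect_paddle_burst(accel_magnitudes, sample_hz, threshold=15.0,
                        min_s=1.0, max_s=5.0):
    """Flag an aggressive-paddling burst in arm-worn IMU data (sketch).

    accel_magnitudes: per-sample acceleration magnitudes (m/s^2).  A
    qualifying burst is a run of samples above `threshold` lasting
    roughly one to five seconds: long enough to be aggressive, short
    enough to match the wave-catching signature described above rather
    than ordinary sustained paddling.
    """
    runs, run = [], 0
    for a in accel_magnitudes:
        if a > threshold:
            run += 1
        else:
            if run:
                runs.append(run)
            run = 0
    if run:
        runs.append(run)
    return any(min_s * sample_hz <= r <= max_s * sample_hz for r in runs)
```

A two-second burst of high readings triggers the detector, while a long, steady stretch of paddling does not.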
- Camera image stabilization algorithms that are optimized for stabilizing handheld video recording often do not work well (and are sometimes detrimental) when the camera is being pointed by an electromechanical camera aiming system, since the motion characteristics of electromechanical camera aiming systems are often much different from those of human cameramen. In a preferred embodiment of the present invention, the microcontroller of the pan and tilt mechanism implements an image stabilization method that dampens high frequency vibrations but allows low frequency motions. This can be done using either mechanical or digital methods, or using a combination of both.
- In another preferred embodiment of the present invention, cooperative object tracking (COT) is combined with video recognition (VR). For example, the system starts tracking using a COT method (i.e., the subject holds a tracking beacon and IR, RF, GPS, etc., are used to track the beacon). The VR software automatically learns to recognize the subject that is being tracked and the system starts to track based on a combination of VR and COT. After some time, the system needs only VR to track the subject. In a variation of this method, the subject spends a certain amount of time using the beacon and “teaching” the system to operate based on VR, after which the use of the beacon is discontinued. One advantage of this method is that the beacon may be used less, requiring less battery power and permitting the use of a smaller beacon. A further advantage is that the subject can become free from having to carry the beacon. In another embodiment, after the system “learns” to recognize the subject, the system “learns” to recognize other subjects of the same “class”. Here “class” signifies, for example, a group of people engaged in a similar activity (e.g., riding bicycles, surfing, etc.), or wearing similar clothing (e.g., uniforms as on a soccer team). Systems using class tracking may include a multiplicity of tracking cameras in a network that would follow members of the class based on their location (e.g., being within the soccer field, in the water, etc.), or other criteria (e.g., velocity, distance from the camera, etc.). Thus, the network of tracking cameras uses cooperative object tracking to start, but eventually uses the network's improved pixel tracking to track objects or events that are not being tracked with cooperative object tracking.
-
FIG. 10 is a flow diagram of a method of orientation according to a preferred embodiment of the present invention. Regarding such method, it is assumed that the tracking system is oriented nearly at the object such that the radiation signal emitted by the beacon is registered by both antennas of each antenna pair mounted on orthogonal planes. A beacon associated with the object to be tracked periodically transmits a signal in step 410. The signal preferably incorporates a unique identifier of the beacon. The signal is preferably a radio wave, but it may be electromagnetic radiation from a different part of the electromagnetic spectrum (e.g., infrared light) or a sound wave (e.g., ultrasound). The signal transmitted from the beacon is detected by appropriate antennas associated with a pointing device, such as a video camera, having a pan-tilt mechanism, in step 420. The antennas comprise two orthogonally arranged Paired Broad Gain Antenna (PBGA) systems. The gain of each paired antenna is determined in step 430. For each antenna pair the gain difference is determined algebraically, i.e., both the sign (i.e., which antenna has the higher gain) and the magnitude of the gain difference are determined in step 440. In step 450, the system determines whether the gain difference is nearly zero. If the gain difference is nearly zero, no action is taken (step 460). If the gain difference is not nearly zero, then the method proceeds to step 470. In step 470, the pan-tilt mechanism of the pointing device turns in the direction of the higher gain of each antenna pair. That is, the gain difference detected within the panning plane causes the mechanism to pan if that difference is not nearly zero, and the gain difference detected in the tilt plane causes the mechanism to tilt if that difference is not nearly zero. The definition of “nearly zero” is equivalent to defining a deadband that is useful to avoid jittery reaction to minute changes in the detected antenna gains. 
Such minute changes may be due to minute changes in the position of the object tracked or may be due to system noise. In step 480, turning velocity is controlled in part based on the magnitude of the gain difference detected between each pair of antennas. This is based on the recognition that when there is a higher difference in gain, the object orientation is farther from the current orientation of the pointer. - When orienting a camera at a moving object, like a person carrying a beacon, commands for orienting the camera have to be provided to the orienting mechanism with sufficient frequency. This frequency is, however, still relatively low, in the range of three to 10 Hz in most cases. Also, a beacon associated with a person (for example, as a device attached to an armband) may move in such a way that the radiation intensity emitted in the direction of the receiving antennas may change rapidly. To avoid errors associated with such changes between the measurements of the gains of the antenna pair, one may want to take measurements in very quick succession, for example at a rate of 100 Hz. Using this as an example, one would generate 50 gain difference readings per second, but orientation commands to the pan-tilt mechanism that orient the pointing device/camera need to be provided only at a rate of about 5 Hz, for example. That would allow for an averaging of 10 gain difference readings before each turning command and a corresponding improvement of signal-to-noise ratio. Alternatively, Kalman filtering may also be employed.
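The deadband, proportional turning velocity, and reading-averaging steps above can be combined into a single per-axis control update, sketched here; the deadband width, proportional gain, and rate limit are illustrative assumptions:

```python
def pan_command(gain_diffs, deadband=0.5, k=0.2, max_rate=1.0):
    """One pan-axis control update from recent gain-difference readings.

    gain_diffs: a batch of (left gain - right gain) readings for one
    antenna pair, e.g. 10 readings taken at 100 Hz per 5 Hz command
    cycle as described above; averaging them improves the
    signal-to-noise ratio.  Returns a signed turn rate: zero inside the
    deadband (steps 450/460), otherwise proportional to the averaged
    difference (steps 470/480) and clipped to max_rate.
    """
    avg = sum(gain_diffs) / len(gain_diffs)
    if abs(avg) < deadband:           # deadband: ignore minute changes
        return 0.0
    rate = k * avg                    # larger difference -> faster turn
    return max(-max_rate, min(max_rate, rate))
```

The same update, fed with the up/down antenna pair's readings, serves the tilt axis.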
- In another preferred embodiment of the present invention, the tracking apparatus is a part of a system that comprises other similar apparatuses (automated cameramen) and also a human cameraman, or more generally, a human user in control of the system. The human user may or may not be a part of the action that is being filmed. The human user can determine when the action is worth recording or photographing. A problem with automated cameramen is that they do not have the same intelligence as a human to determine when to start or stop filming. For example, automated cameramen will not inherently respond to a director yelling “action”. The one or more automated cameramen in the present embodiment start or stop recording (or take photos) based on the remote control of a human. The remote control is activated when the human user is a cameraman and uses their camera such that the automated cameramen mimic what the human cameraman is doing. For example, when the human cameraman begins recording, a signal is sent to all the automated cameramen, which also start recording.
- In yet another preferred embodiment of the present invention, multiple tracking devices are interconnected to form a system wherein each tracking device utilizes data from all devices in a machine learning protocol to learn from the others, so that tracking uses location and biometric data compared with pixel tracking and all the other types of tracking to further optimize the tracking and video recording process. As a result, a network of tracking devices will act as an array of eyes and ears that automatically senses and alerts to problems. Examples are: (a) a camera “sees” in its periphery that a car accident occurred and automatically dials 911; (b) a camera that is part of a municipal network “sees” a police officer in trouble and alerts the police department; (c) a camera is following a surfer but senses a drowning victim in the background, or even recognizes a whale breaching, and alerts the network.
- In yet another preferred embodiment of the present invention, the tracking apparatus is used to recognize events. For example, a sporting event can be identified based on the types of motions detected (e.g., youth soccer). The inventive system of tracking devices may also preferably connect with a cellular company (such as Verizon) to detect their customers in the vicinity of an event based on users' cell phone location data. The company can reasonably assume that those customers are also interested in that type of event (e.g., youth soccer). Then anyone with a Verizon phone in the vicinity may be flagged as being interested in youth soccer and targeted for related advertising.
- During the cooperative object tracking, video recognition algorithms may also be used for collecting additional customer data.
-
FIG. 11 is a flow diagram of an automated editing and publishing method for the footage recorded by the inventive system. A problem of uploading video footage to a remotely located editing service is that video footage files are large and time consuming to transfer. An inventive way to solve this problem is illustrated using FIG. 11. In this method the user films, using the inventive system and method hereof, footage in high resolution in step 500. The footage is saved on the user's device in step 510. The user's device creates a low resolution version of the footage in step 520, and uploads it to a remotely located server (computer) of an editing service in step 530. The editing service then edits the low resolution footage into “appealing video clips” in step 540. The editing software that the editing service uses then creates a set of specific editing instructions that are to be implemented on the high resolution version on the user's device in step 550. The instructions are sent to the user and are implemented on the user's device, which has editing software capable of following the editing instructions generated in step 550 and replicates the appealing video clip(s) on the user's device using the high resolution video files (step 560). The result may be reviewed by the user in step 570. The user decides whether the edit is “good” or not in step 580. If not, the editing process may be repeated by returning to step 540. At this step, the user may provide editing suggestions, instructions, and the like to improve the second round of editing. If the user is satisfied with the edit, the user uses the high resolution edited footage in step 590. For example, the user may upload the edited clip to a server for viewing by others. - One benefit of the method of
FIG. 11 is that large, high resolution video foes never needed to be uploaded for editing. The remotely located video editing service may be an automated service or a service performed by humans. Step 580 is optional as noted by the dashedline connecting step 570 withstep 590. In a preferred embodiment, such automated editing service collects feedback data to improve its editing capabilities. As more people use the service and provide feedback, the quality of the editing service improves. The feedback could be based on users modifying the edited video clips on their own computers software. Data on the modifications which the user made is sent as feedback to the editing service to improve their editing algorithms. - Applicant hereby incorporates by reference in their entirety the following co-owned patent applications which may assist in understanding the present invention: U.S. patent application Ser. No. 13/801,336, titled “System and Method for Video Recording and Webcasting Sporting Events”, PCT International Patent Application No. PCT/US2013/041187, titled “High Quality Video Sharing Systems”, PCT International Patent Application No. PCT/US2013/070903, titled “Automatic Cameraman, Automatic Recording System and Automatic Recording Network”, and U.S. patent application Ser. No. 14/600,177, titled “Neural Network for Video Editing”.
Different embodiments, features and methods of the invention are described with the aid of the figures; however, the particular described embodiments, features and methods should not be construed as being the only ones that constitute the practice of the invention, and the described embodiments, features and methods are in no way substitutes for the broadest interpretation of the invention as claimed.
Claims (5)
1. A method of automated tracking and filming of an object, the method comprising:
emitting a signal via a beacon that is associated with the object;
detecting the signal via a first tracking mechanism at a first location,
the first tracking mechanism comprising:
a first wide angle antenna,
a second wide angle antenna,
a camera, and
a microcontroller, and
the signal is detected by each of the first wide angle antenna and the second wide angle antenna;
comparing, via the microcontroller, an intensity of the signal detected by the first wide angle antenna and an intensity of the signal detected by the second wide angle antenna; and
controlling, via the microcontroller, the first tracking mechanism to turn the first tracking mechanism such that a difference between the intensity of the signal detected by the first wide angle antenna and the intensity of the signal detected by the second wide angle antenna is diminished;
recording and saving a pointing direction of the camera based on the difference, between the intensity of the signal detected by the first wide angle antenna and the intensity of the signal detected by the second wide angle antenna, being substantially zero; and
recording footage of the object via the camera.
2. The method of claim 1 , further comprising:
operating a plurality of tracking mechanisms based on control of the first tracking mechanism, wherein the plurality of tracking mechanisms include a plurality of cameras;
recording and saving pointing directions of the plurality of cameras, wherein
each of the plurality of tracking mechanisms is located at a different location,
distances between the plurality of different locations are pre-defined,
directions between the plurality of different locations are pre-defined,
each of the plurality of tracking mechanisms is substantially similar to the first tracking mechanism,
the pointing directions are recorded and saved based on a difference, between signal intensities detected by antennas of the plurality of tracking mechanisms, being substantially zero;
calculating locations of the object based on:
the pointing directions,
the distances between the plurality of different locations, and
the directions between the plurality of different locations; and
recording footage of the object via the plurality of cameras.
3. The method of claim 1 , further comprising:
transmitting, based on a circularly polarized radiation, the signal emitted by the beacon; and
detecting the circularly polarized radiation based on patch antennas that correspond to the first wide angle antenna and the second wide angle antenna.
4. The method of claim 2 , further comprising:
starting or stopping a video recording process; and
capturing still photographs based on a remote control.
5. The method of claim 2 , further comprising:
analyzing the recorded footage of the object by executing an image recognition software;
executing image recognition of the object based on machine learning methods; and
tracking the object based on the execution of the image recognition of the object.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US17/720,086 US20220236359A1 (en) | 2014-02-24 | 2022-04-13 | Cooperative automatic tracking |
Applications Claiming Priority (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US201461943903P | 2014-02-24 | 2014-02-24 | |
US14/630,524 US10241186B2 (en) | 2014-02-24 | 2015-02-24 | Cooperative automatic tracking |
US16/357,115 US20190219656A1 (en) | 2014-02-24 | 2019-03-18 | Cooperative automatic tracking |
US17/720,086 US20220236359A1 (en) | 2014-02-24 | 2022-04-13 | Cooperative automatic tracking |
Related Parent Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US16/357,115 Continuation US20190219656A1 (en) | 2014-02-24 | 2019-03-18 | Cooperative automatic tracking |
Publications (1)
Publication Number | Publication Date |
---|---|
US20220236359A1 true US20220236359A1 (en) | 2022-07-28 |
Family
ID=53879146
Family Applications (3)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US14/630,524 Active 2037-06-26 US10241186B2 (en) | 2014-02-24 | 2015-02-24 | Cooperative automatic tracking |
US16/357,115 Abandoned US20190219656A1 (en) | 2014-02-24 | 2019-03-18 | Cooperative automatic tracking |
US17/720,086 Pending US20220236359A1 (en) | 2014-02-24 | 2022-04-13 | Cooperative automatic tracking |
Family Applications Before (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US14/630,524 Active 2037-06-26 US10241186B2 (en) | 2014-02-24 | 2015-02-24 | Cooperative automatic tracking |
US16/357,115 Abandoned US20190219656A1 (en) | 2014-02-24 | 2019-03-18 | Cooperative automatic tracking |
Country Status (2)
Country | Link |
---|---|
US (3) | US10241186B2 (en) |
WO (1) | WO2015127440A1 (en) |
Families Citing this family (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US10291725B2 (en) | 2012-11-21 | 2019-05-14 | H4 Engineering, Inc. | Automatic cameraman, automatic recording system and automatic recording network |
WO2016151925A1 (en) * | 2015-03-26 | 2016-09-29 | 富士フイルム株式会社 | Tracking control device, tracking control method, tracking control program, and automatic tracking/image-capturing system |
US10728488B2 (en) | 2015-07-03 | 2020-07-28 | H4 Engineering, Inc. | Tracking camera network |
WO2017197174A1 (en) | 2016-05-11 | 2017-11-16 | H4 Engineering, Inc. | Apparatus and method for automatically orienting a camera at a target |
US10742494B2 (en) * | 2017-04-27 | 2020-08-11 | Veoneer Us, Inc. | System and method for configuring at least one sensor system of a vehicle |
CN107920232B (en) * | 2017-11-28 | 2020-05-15 | 江苏如是地球空间信息科技有限公司 | Intelligent tracking camera system based on GPS positioning coordinate resolving |
US10805012B1 (en) * | 2019-11-27 | 2020-10-13 | NortonLifeLock, Inc. | Systems and methods for protecting users |
JP2022175287A (en) * | 2021-05-13 | 2022-11-25 | キヤノン株式会社 | Imaging apparatus, control method thereof, and program |
Citations (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US2885666A (en) * | 1957-06-12 | 1959-05-05 | Oscar J Snow | Automatic tracking system for a radar or microwave receiving system |
US4771288A (en) * | 1985-05-17 | 1988-09-13 | Motorola, Inc. | Centroid detection apparatus and method therefor |
US6181271B1 (en) * | 1997-08-29 | 2001-01-30 | Kabushiki Kaisha Toshiba | Target locating system and approach guidance system |
US6710713B1 (en) * | 2002-05-17 | 2004-03-23 | Tom Russo | Method and apparatus for evaluating athletes in competition |
US20050228613A1 (en) * | 2004-04-12 | 2005-10-13 | Time Domain Corporation | Method and system for extensible position location |
US20140313345A1 (en) * | 2012-11-08 | 2014-10-23 | Ornicept, Inc. | Flying object visual identification system |
US20150169968A1 (en) * | 2013-12-18 | 2015-06-18 | Johnson Controls Technology Company | In-vehicle camera and alert systems |
Family Cites Families (14)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP3339358B2 (en) * | 1997-05-09 | 2002-10-28 | 三菱電機株式会社 | Antenna control device |
IL144936A0 (en) * | 1999-02-18 | 2002-06-30 | Tno | Monopulse phased array system |
US6323817B1 (en) * | 2000-01-19 | 2001-11-27 | Hughes Electronics Corporation | Antenna cluster configuration for wide-angle coverage |
US6433736B1 (en) * | 2000-11-22 | 2002-08-13 | L-3 Communications Corp. | Method and apparatus for an improved antenna tracking system mounted on an unstable platform |
EP1530255A1 (en) * | 2003-11-07 | 2005-05-11 | Matsushita Electric Industrial Co., Ltd. | Adaptive antenna apparatus provided with a plurality of pairs of bidirectional antennas |
EP1772034B1 (en) * | 2004-06-30 | 2015-08-12 | Telefonaktiebolaget L M Ericsson (publ) | Antenna beam shape optimization |
TW200735457A (en) * | 2006-03-14 | 2007-09-16 | Mitac Technology Corp | Antenna having the member to regulate the pattern of radiation |
US7724188B2 (en) * | 2008-05-23 | 2010-05-25 | The Boeing Company | Gimbal system angle compensation |
US8138988B2 (en) * | 2008-07-30 | 2012-03-20 | AlpZite, LLC | Object locating apparatus, system and high-gain, high-selectivity antennae therefor |
US20110025464A1 (en) * | 2009-07-30 | 2011-02-03 | Awarepoint Corporation | Antenna Diversity For Wireless Tracking System And Method |
TW201208335A (en) * | 2010-08-10 | 2012-02-16 | Hon Hai Prec Ind Co Ltd | Electronic device |
US9055226B2 (en) * | 2010-08-31 | 2015-06-09 | Cast Group Of Companies Inc. | System and method for controlling fixtures based on tracking data |
WO2013131036A1 (en) * | 2012-03-01 | 2013-09-06 | H4 Engineering, Inc. | Apparatus and method for automatic video recording |
EP2870748A4 (en) * | 2012-07-06 | 2016-03-09 | H4 Eng Inc | A remotely controlled automatic camera tracking system |
2015
- 2015-02-24 US US14/630,524 patent/US10241186B2/en active Active
- 2015-02-24 WO PCT/US2015/017304 patent/WO2015127440A1/en active Application Filing
2019
- 2019-03-18 US US16/357,115 patent/US20190219656A1/en not_active Abandoned
2022
- 2022-04-13 US US17/720,086 patent/US20220236359A1/en active Pending
Also Published As
Publication number | Publication date |
---|---|
US20150241546A1 (en) | 2015-08-27 |
US10241186B2 (en) | 2019-03-26 |
WO2015127440A1 (en) | 2015-08-27 |
US20190219656A1 (en) | 2019-07-18 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US20220236359A1 (en) | Cooperative automatic tracking | |
US11300650B2 (en) | Apparatus and method for automatically orienting a camera at a target | |
JP6227553B2 (en) | Portable system for high quality automatic recording | |
US9800769B2 (en) | Apparatus and method for automatic video recording | |
US9497388B2 (en) | Zooming factor computation | |
US8508595B2 (en) | Surveillance camera system for controlling cameras using position and orientation of the cameras and position information of a detected object | |
US7738008B1 (en) | Infrared security system and method | |
US20160127695A1 (en) | Method and apparatus for controlling a camera's field of view | |
JP2015109641A (en) | System for following object marked by tag device with camera | |
WO2018217260A3 (en) | Systems and methods for tracking and controlling a mobile camera to image objects of interest | |
US10659679B1 (en) | Facial location determination | |
US20090304374A1 (en) | Device for tracking a moving object | |
WO2018076573A1 (en) | Image acquisition method, electronic device, and storage medium | |
US10938102B2 (en) | Search track acquire react system (STARS) drone integrated acquisition tracker (DIAT) | |
US10165172B2 (en) | Camera system and control method therefor, and electronic device and control program therefor | |
US20060171705A1 (en) | Automatic tracking system | |
KR20200023974A (en) | Method and apparatus for synchronization of rotating lidar and multiple cameras | |
US20140111653A1 (en) | Method and system for the tracking of a moving object by a tracking device | |
EP3255450A1 (en) | A positioning arrangement | |
KR20110078655A (en) | A method and a system of filming using multiple camera | |
CN113784041A (en) | Automatic tracking photography holder and method based on UWB | |
KR101515472B1 (en) | Apparatus and system for tracing and photographing electric power facility | |
KR101589470B1 (en) | Security monitoring system and method using ultra wideband through-the-wall radar and image sensor | |
US20240114247A1 (en) | Remotely-operable trail camera with zoom, pan & tilt | |
CN116913041A (en) | Vision monitoring and early warning method and system based on Beidou |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AS | Assignment | Owner name: H4 ENGINEERING, INC., CALIFORNIA Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:BOYLE, CHRISTOPHER T;TAYLOR, SCOTT K;SAMMONS, ALEXANDER G;AND OTHERS;SIGNING DATES FROM 20140422 TO 20140426;REEL/FRAME:059590/0050 |
STPP | Information on status: patent application and granting procedure in general | Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION |
STPP | Information on status: patent application and granting procedure in general | Free format text: NON FINAL ACTION MAILED |