WO2018205133A1 - Direction finding of wireless communication devices - Google Patents

Direction finding of wireless communication devices

Info

Publication number
WO2018205133A1
Authority
WO
WIPO (PCT)
Prior art keywords
antenna
peripheral
wireless signal
target object
wireless
Prior art date
Application number
PCT/CN2017/083588
Other languages
French (fr)
Inventor
Guobin Shen
Zheng Han
Original Assignee
Zepp Labs, Inc.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Zepp Labs, Inc. filed Critical Zepp Labs, Inc.
Priority to PCT/CN2017/083588 priority Critical patent/WO2018205133A1/en
Publication of WO2018205133A1 publication Critical patent/WO2018205133A1/en

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 - Image analysis
    • G06T7/20 - Analysis of motion
    • G06T7/292 - Multi-camera tracking
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 - Indexing scheme for image analysis or image enhancement
    • G06T2207/30 - Subject of image; Context of image processing
    • G06T2207/30221 - Sports video; Sports image

Definitions

  • This disclosure relates generally to digital content processing and particularly to capturing video highlights in sports videos by tracking the position of a target object and synchronizing a video capturing device (e.g., a mobile phone) on a rotary station with the movement of the target object.
  • a video highlight of a sports video is a portion of the sports video and represents a semantically important event captured in the sports video (e.g., a short video clip capturing goals or goal attempts in a soccer game video) .
  • Embodiments of the disclosure provide an object tracking device for tracking the movement of a target object.
  • the object tracking device includes a direction finding subsystem and a steering subsystem.
  • the direction finding subsystem determines the direction of the target object.
  • the direction finding subsystem includes multiple peripheral antennas, a reference antenna, an antenna switch coupled to the peripheral antennas, a first wireless radio, a second wireless radio, and a processing unit.
  • the antenna switch selects one of the plurality of peripheral antennas.
  • the first wireless radio receives a wireless signal from the target object via a peripheral antenna selected by the antenna switch.
  • the second wireless radio receives the wireless signal from the target object via the reference antenna.
  • the processing unit controls the antenna switch and determines the direction of the target object based on signals sampled by the first wireless radio and the second wireless radio.
  • the steering subsystem includes a holder, one or more motors, and a control unit.
  • the holder is configured to be attached to a video capturing device (e.g., a video camera) .
  • the one or more motors are configured to rotate the holder.
  • the control unit receives the direction of the target object determined by the direction finding subsystem and controls the one or more motors based on the received direction of the target object.
  • Embodiments also include a method for determining the direction of a target object.
  • a first wireless signal is sampled using a first peripheral antenna of a plurality of peripheral antennas, and a reference antenna.
  • a phase difference between the first wireless signal sampled by the first peripheral antenna and the first wireless signal sampled by the reference antenna is determined.
  • a second wireless signal is sampled using a second peripheral antenna of the plurality of peripheral antennas, and the reference antenna.
  • a phase difference between the second wireless signal sampled by the second peripheral antenna and the second wireless signal sampled by the reference antenna is determined.
  • the direction of the target object is determined based on (1) the phase difference between the first wireless signal sampled by the first peripheral antenna and the first wireless signal sampled by the reference antenna, and (2) the phase difference between the second wireless signal sampled by the second peripheral antenna and the second wireless signal sampled by the reference antenna. More such phase differences can be similarly obtained using wireless signal samples from additional peripheral antennas against the same reference antenna, and incorporated into the determination of the direction of the target object.
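The phase-difference procedure above can be sketched numerically. This is a minimal illustration, not the patented implementation: the circular geometry, the BLE wavelength, and the grid-search estimator are all assumptions made for the example.

```python
import numpy as np

# Geometry assumed for this sketch: peripheral antennas on a circle of radius
# lambda/4 around the central reference antenna (2.4 GHz BLE -> lambda ~12.5 cm).
WAVELENGTH = 0.125
RADIUS = WAVELENGTH / 4

def expected_phase_diffs(theta, antenna_angles):
    """Modelled phase of each peripheral antenna relative to the reference."""
    return (2 * np.pi * RADIUS / WAVELENGTH) * np.cos(theta - antenna_angles)

def estimate_direction(measured_diffs, antenna_angles):
    """Grid-search the bearing whose modelled phase differences best match."""
    grid = np.linspace(0, 2 * np.pi, 3600, endpoint=False)
    scores = [np.sum(np.cos(measured_diffs - expected_phase_diffs(t, antenna_angles)))
              for t in grid]          # cosine distance handles 2*pi wrap-around
    return grid[int(np.argmax(scores))]

# Simulate a target at 40 degrees and recover it from noisy phase differences.
angles = np.linspace(0, 2 * np.pi, 6, endpoint=False)   # 6 peripheral antennas
true_theta = np.deg2rad(40.0)
measured = expected_phase_diffs(true_theta, angles) \
    + np.random.default_rng(0).normal(0, 0.02, 6)
est = estimate_direction(measured, angles)
print(round(float(np.rad2deg(est)), 1))
```

A real system would invert the phase model analytically or with a subspace method rather than a grid search, but the grid search keeps the relationship between phase differences and bearing easy to see.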
  • FIG. 1 is a block diagram of an environment for tracking a wireless communication device and synchronizing a video capturing device (e.g. a mobile phone) connected to a rotary station with the movement of the wireless communication device, according to one embodiment.
  • FIG. 2A illustrates a block diagram of the rotary system, according to one embodiment.
  • FIG. 2B illustrates a block diagram of the rotary system, according to another embodiment.
  • FIG. 3A illustrates a block diagram of the camera steering & servo subsystem, according to one embodiment.
  • FIG. 3B illustrates a block diagram of the BLE direction finding subsystem, according to one embodiment.
  • FIG. 4A illustrates a top and side view of an antenna array arranged in a straight line, according to one embodiment.
  • FIG. 4B illustrates a top and side view of an antenna array arranged in a circular pattern, according to one embodiment.
  • FIG. 5A illustrates a uniform linear antenna array, according to one embodiment.
  • FIG. 5B illustrates a circular antenna array, according to one embodiment.
  • FIG. 6 illustrates a flow diagram of a method for determining the direction of a target object using an antenna switching pattern, according to one embodiment.
  • FIG. 7 shows a circular antenna array with two wireless radios, according to one embodiment.
  • FIG. 8 illustrates a flow diagram of a method for determining the direction of a target object using two wireless radios, according to one embodiment.
  • FIG. 9 illustrates a diagram for reducing a multipath effect of signals on a circular antenna array, according to one embodiment.
  • FIG. 10 illustrates a diagram of the field of view (FoV) for deciding whether to rotate the video capturing device, according to one embodiment.
  • FIG. 11 illustrates a diagram of two rotary systems tracking an object moving from a first position to a second position, according to one embodiment.
  • a solution is provided to conveniently capture video highlights in a sports event by tracking movement of a target object and synchronizing rotations of a video capturing device (e.g., a mobile phone) on a rotary station with the movement of the target object.
  • the target object includes a wireless module that transmits a wireless signal that can be used to determine the position of the target object.
  • Position data describing the position of the target object, or motion data describing the movement of the target object is transmitted to the rotary station, and processed to generate commands to drive the video capturing device (e.g., a mobile phone) mounted on the rotary station, to follow the movement of the target object.
  • digital content generally refers to any machine-readable and machine-storable work.
  • Digital content can include, for example, a combination of video and audio.
  • digital content can be a still image.
  • the digital content will be referred to as a “video” but no limitation on the type of digital content that can be analyzed is intended by this terminology.
  • FIG. 1 is a block diagram of an environment 100 for tracking a wireless communication device and synchronizing a video capturing device (e.g. a mobile phone) connected to a rotary station with the movement of the wireless communication device, according to one embodiment.
  • the embodiment illustrated in FIG. 1 includes a target 160 that transmits a wireless signal using one or more wireless communication protocols (e.g., Bluetooth or Wi-Fi) , a video capturing device 105 mounted on a rotary system 110.
  • the rotary system 110 is further mounted on a tripod 150. If the video capturing device 105 is recording a sports game, the target 160 may be inside a sports field 120.
  • the target 160 includes a wireless module that transmits a wireless signal using a wireless communication protocol.
  • the target 160 transmits a data packet using the wireless communication protocol such that the encoded wireless signal has a specific characteristic (e.g., a sinusoidal signal with a single frequency component) .
  • a data packet may be generated so that the encoded wireless signal is a sequence of zeros or a sequence of ones.
  • although the Bluetooth protocol is used throughout, the present disclosure is not limited thereto.
  • other wireless protocols such as Wi-Fi, radio-frequency identification (RFID) or Zigbee may be used instead.
  • the video capturing device 105 is an electronic device used to capture digital content such as recording a video clip of a sports event happening on the sports field 120.
  • the video capturing device 105 is a mobile phone and includes an image capturing device and a transmitter for transmitting the captured digital content.
  • the video capturing device 105 can be, for example, a smart phone, a tablet, a digital single-lens reflex camera (DSLR) , or any other suitable user device or computing device for capturing digital content.
  • the video capturing device 105 can be remotely triggered by a remote control to capture digital content.
  • the video capturing device 105 captures digital content for a predetermined period of time (e.g., 30 seconds) . In other configurations, the video capturing device 105 begins capturing digital content when remotely triggered by the remote control and ends capturing digital content when again remotely triggered by the remote control. In addition, the remote control may have certain function buttons to tag the content being captured, and have certain sensors to incorporate rich content tags such as a voice tag. The video capturing device 105 can transmit the captured digital content to a cloud storage service. The video capturing device 105 is mounted on or otherwise connected to the rotary system 110.
  • a user may use a mobile device as a video capturing device to record video clips of a sports event and to consume digital content, such as the recorded video clips or highlights thereof.
  • the user uses the mobile device to perform functions such as, consuming video clip (s) captured by the video capturing device 105, and video highlights generated from the captured video clip (s) .
  • the mobile device of the user can be a smart phone, a tablet, or any other suitable user device or computing device for consuming video clip (s) .
  • the mobile device of the user provides a user interface, such as physical and/or on-screen buttons, with which the user may interact with the mobile device to perform functions such as tagging, viewing, sharing, and/or otherwise consuming the video clip (s) .
  • the rotary system 110 is configured to drive the video capturing device 105 (e.g., a mobile phone of the user having a digital camera) mounted on the rotary system 110 to follow the movement of the target object 160.
  • the rotary system 110 is further described with respect to FIG. 2A and 2B.
  • FIG. 2A illustrates a block diagram of the rotary system 110, according to one embodiment.
  • the rotary system 110 includes a camera holder 210, a camera steering & servo subsystem (CSSS) 220, and a Bluetooth Low Energy (BLE) direction finding subsystem (BDFS) 230.
  • the camera holder 210 provides support for attaching a video capturing device 105, such as, a video camera recorder, to the rotary system 110.
  • the camera steering & servo subsystem (CSSS) 220 receives an indication of the direction of the target 160 and moves the camera holder to direct the video capturing device to the identified direction of the target 160.
  • FIG. 3A shows a block diagram of the CSSS 220, according to one embodiment.
  • the CSSS 220 includes one or more gear sets 310, one or more motors 320, a rechargeable battery 330, and a control circuitry 340.
  • the one or more motors rotate the camera holder 210 with one or more degrees of freedom.
  • the CSSS 220 includes two motors that provide movement to the camera holder 210 with two degrees of freedom (e.g., panning and tilting) .
  • the control circuitry 340 implements the camera steering logic to allow the camera to follow the target. For instance, the control circuitry 340 may keep the target within the field of view (FoV) of the camera.
  • the CSSS 220 further includes a gear set 310 and a code wheel to provide a higher precision control to the rotation of the camera holder 210.
  • the CSSS 220 further includes a wireless module to enable communication with other devices (e.g., a remote control or a phone or the video capturing device 105) .
  • the motors 320 are voltage controlled. That is, the rotational velocity of the motor 320 is directly proportional to the voltage applied to the motor 320. Furthermore, the codewheel provides a one-degree precision, while a gear set provides a 1:36 reduction. That is, the codewheel combined with a gear set 310 provides a 1/36-degree steering step size.
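The steering resolution claimed above follows from simple arithmetic, checked here; the numbers come directly from the text (one-degree codewheel ticks, 1:36 gear reduction).

```python
# Back-of-the-envelope check of the steering resolution described above:
# a codewheel with 1-degree resolution driven through a 1:36 gear reduction.
CODEWHEEL_RESOLUTION_DEG = 1.0
GEAR_REDUCTION = 36

step_deg = CODEWHEEL_RESOLUTION_DEG / GEAR_REDUCTION
print(step_deg)  # 1/36 of a degree per detectable steering step
```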
  • the control circuitry 340 controls the steering of the CSSS 220.
  • the CSSS is configured to filter the position information of the target object 160 to increase the smoothness of a video captured by the video capturing device 105, while keeping the target object near the center of the field of view of the video capturing device 105.
  • the BLE direction finding subsystem (BDFS) 230 finds the direction of the target 160 based on a wireless signal transmitted by a wireless transceiver of the target 160.
  • the wireless signal is a Bluetooth signal.
  • FIG. 3B shows a block diagram of the BDFS 230, according to one embodiment.
  • the BDFS 230 includes multiple antennas 350, one or two wireless radios 360 (e.g., two Bluetooth Low Energy (BLE) radios) , an antenna switch 370, and a microcontroller 390.
  • the BDFS 230 receives power from the battery of the CSSS 220.
  • the BDFS 230 includes a battery for powering the components of the BDFS 230.
  • Each of the wireless radios 360 is coupled to a subset of antennas.
  • the BDFS 230 includes two wireless radios.
  • one wireless radio is coupled to a first antenna (e.g., a reference antenna)
  • the other wireless radio is coupled to the other antennas.
  • the antenna switch 370 controls the wireless radios to switch between antennas. As such, a single wireless radio may receive or transmit different wireless signals using different antennas 350.
  • the wireless radios 360 receive electrical signals picked up by the antennas 350 and demodulate the electrical signals to extract information carried by the signals.
  • the wireless radios 360 may further modulate information to be sent via the antennas, producing a current that can be used to excite one or more antennas to emit radio waves.
  • the wireless radios 360 are used in conjunction with the antennas 350 to receive wireless signals from the target object 160 for determining the direction of the target object.
  • the wireless radios 360 and the wireless antennas 350 are configured to receive Bluetooth signals.
  • one or more radios 360 are used in conjunction with one or more antennas 350 to communicate with other mobile devices.
  • a rotary system 110 may communicate with another rotary system 110 via Bluetooth using one wireless radio 360 coupled to one antenna 350.
  • the rotary system 110 may also wirelessly communicate with the video capturing device 105.
  • the rotary system 110 may obtain properties of the video capturing device 105, such as, field of view information.
  • the analog-to-digital converter (ADC) 380 converts an analog signal received from a wireless radio 360 into a digital signal.
  • the ADC 380 is a discrete component.
  • the ADC is integrated inside the microcontroller 390.
  • the microcontroller 390 implements the core control logic (CCL) , the direction finding (DF) algorithm, and the post filtering of the resulting directions.
  • the CCL performs admission control of BLE devices by obtaining BLE device ID information from the BLE chip and processing incoming data from BLE chips that are included in a whitelist.
  • the CCL further performs A/D control by indicating to the ADC 380 when to sample signals.
  • the CCL further performs antenna switch control by instructing the antenna switch 370 to switch among antennas at a certain time and according to a certain antenna switch pattern (ASP) .
  • the DF algorithm performs multiple signal classification and determines the parameters of multiple wavefronts arriving at the antennas 350 from the signal measurements made at each of the antennas.
  • the DF algorithm may determine the number of incoming signals, the strength of signals and noise, the cross correlation among directional waveforms, direction of arrival of the incoming signals, and the frequency components of the incoming signals.
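The "multiple signal classification" processing named above is commonly realized with the MUSIC algorithm. The sketch below is an illustrative MUSIC implementation for a uniform linear array, not the patented DF algorithm; the array size, SNR, and grid resolution are assumptions for the example.

```python
import numpy as np

def steering(theta, n_ant, d_over_lambda=0.5):
    """ULA steering vector for incident angle theta (radians)."""
    n = np.arange(n_ant)
    return np.exp(-2j * np.pi * d_over_lambda * n * np.sin(theta))

def music_spectrum(snapshots, n_sources):
    """MUSIC pseudospectrum: peaks where the steering vector is orthogonal
    to the noise subspace of the sample covariance matrix."""
    n_ant = snapshots.shape[0]
    R = snapshots @ snapshots.conj().T / snapshots.shape[1]   # sample covariance
    _, eigvecs = np.linalg.eigh(R)                            # ascending eigenvalues
    En = eigvecs[:, : n_ant - n_sources]                      # noise subspace
    grid = np.deg2rad(np.arange(-90, 90, 0.25))
    p = np.array([
        1.0 / np.real(steering(t, n_ant).conj() @ En @ En.conj().T @ steering(t, n_ant))
        for t in grid
    ])
    return grid, p

# Simulate one source at +25 degrees seen by a 5-element array with noise.
rng = np.random.default_rng(1)
n_ant, n_snap = 5, 200
a = steering(np.deg2rad(25.0), n_ant)
s = rng.normal(size=n_snap) + 1j * rng.normal(size=n_snap)          # source signal
noise = 0.1 * (rng.normal(size=(n_ant, n_snap)) + 1j * rng.normal(size=(n_ant, n_snap)))
X = np.outer(a, s) + noise
grid, p = music_spectrum(X, n_sources=1)
print(round(float(np.rad2deg(grid[int(np.argmax(p))])), 2))
```

MUSIC also yields the number and relative strength of incoming wavefronts from the covariance eigenvalues, which matches the list of outputs the DF algorithm is described as producing.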
  • the post filtering removes outliers and smooths out jitter from the obtained set of directions.
  • the obtained set of directions may be noisy due to system noises and multipath effects of the incoming signal.
  • the post filtering may include removing directions that are not possible or unlikely to happen in practice. For instance, directions that correspond to underground locations may be filtered out for the position of a basketball in a game since it is unlikely the basketball will be underground during an actual gameplay. Examples of post filtering include the Kalman filter.
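A minimal version of the post filter described above, combining a scalar Kalman filter with an outlier gate. The process and measurement noise values and the gate threshold are assumed tuning values, not taken from the patent.

```python
import numpy as np

def smooth_directions(measurements, q=0.01, r=0.25, gate=1.0):
    """Kalman-smooth a direction track (radians); gate out implausible jumps."""
    x, p = measurements[0], 1.0          # state estimate and its variance
    out = [x]
    for z in measurements[1:]:
        p += q                           # predict: random-walk process noise
        if abs(z - x) > gate:            # outlier (e.g., multipath glitch):
            out.append(x)                # skip the update, keep the prediction
            continue
        k = p / (p + r)                  # Kalman gain
        x += k * (z - x)                 # correct toward the measurement
        p *= (1 - k)
        out.append(x)
    return np.array(out)

true = np.linspace(0.0, 0.5, 50)                 # slowly panning target
rng = np.random.default_rng(2)
noisy = true + rng.normal(0, 0.05, 50)
noisy[20] += 2.0                                 # inject a multipath-like glitch
smoothed = smooth_directions(noisy)
print(bool(abs(smoothed[20] - true[20]) < 0.5))  # glitch rejected
```

Filtering out "impossible" directions, such as the underground-basketball example above, corresponds to the gate step here; a production filter would also model angular velocity.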
  • FIG. 4A illustrates a top and side view of an antenna array arranged in a straight line, according to one embodiment.
  • the antenna array of FIG. 4A includes 5 individual antennas arranged in a straight line. In other embodiments, the antenna array may include a different number of antennas.
  • the antenna array includes a reference antenna 410 and multiple peripheral antennas 420.
  • the reference antenna can be any antenna from the antenna array. In the embodiment of FIG. 4A, the reference antenna is at the center of the antenna array. In some embodiments, the antennas are separated with a distance in the range of ⁇ /4 and ⁇ /2, where ⁇ is the wavelength of the wireless signal being received by the antennas.
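For concreteness, the λ/4 to λ/2 spacing guideline above works out to a few centimeters at BLE frequencies. The 2.44 GHz mid-band carrier used below is an assumption for the example.

```python
# Concrete numbers for the lambda/4 .. lambda/2 spacing guideline above,
# assuming a 2.44 GHz BLE carrier (mid-band).
C = 299_792_458            # speed of light, m/s
F = 2.44e9                 # assumed BLE carrier frequency, Hz

wavelength = C / F         # ~0.123 m
lo, hi = wavelength / 4, wavelength / 2
print(round(lo * 100, 1), round(hi * 100, 1))   # spacing range in cm
```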
  • FIG. 4B illustrates a top and side view of an antenna array arranged in a circular pattern, according to one embodiment.
  • the antenna array of FIG. 4B includes multiple peripheral antennas 420 arranged in a circular pattern, and a reference antenna 410 arranged in the center of the peripheral antennas 420.
  • the antenna array includes an even number of peripheral antennas 420 that are evenly spaced. As such, the reference antenna 410 is in between pairs of peripheral antennas 420.
  • FIG. 5A illustrates the direction finding of an incoming signal in a uniform linear antenna array, according to one embodiment.
  • the antenna array includes N+1 antennas, each spaced with a distance d.
  • the incident angle θ to all antennas is substantially equal.
  • the received signals can form a vector as: x(θ) = [1, e^(−j·2πd·sinθ/λ), e^(−j·2·2πd·sinθ/λ), …, e^(−j·N·2πd·sinθ/λ)]·s(t), where s(t) is the transmitted signal, d is the antenna spacing, and λ is the wavelength.
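The linear-array model can be checked numerically: each element's phase lags the reference by 2πnd·sinθ/λ, so a single adjacent-element phase difference already determines the bearing (up to the usual front/back ambiguity). The spacing value below is an assumption within the λ/4 to λ/2 range stated earlier.

```python
import numpy as np

d_over_lambda = 0.4               # assumed spacing, within lambda/4 .. lambda/2
theta = np.deg2rad(30.0)          # true incident angle

# Steering vector for a 5-element linear array (antenna 0 is the reference).
n = np.arange(5)
a = np.exp(-2j * np.pi * d_over_lambda * n * np.sin(theta))

# Invert the adjacent-element phase difference back to the incident angle.
dphi = -np.angle(a[1] * np.conj(a[0]))
theta_hat = np.arcsin(dphi / (2 * np.pi * d_over_lambda))
print(round(float(np.rad2deg(theta_hat)), 1))
```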
  • FIG. 5B illustrates the direction finding of an incoming signal in a circular antenna array, according to one embodiment.
  • in FIG. 5B, antenna 0 (the reference antenna 410) and antenna 1 are along the orientation baseline.
  • the effective incident angle for the k-th antenna is θ_k = θ − 2π(k − 1)/N, where θ is the incident angle relative to the orientation baseline and N is the number of peripheral antennas.
  • the steering vector would be as follows: a(θ) = [1, e^(j·2πr·cos(θ_1)/λ), …, e^(j·2πr·cos(θ_N)/λ)], where r is the radius of the circular array.
  • the circular antenna array exhibits better omni-sensitivity to different incoming directions.
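The omni-sensitivity claim above can be checked numerically. The quantity computed below is how strongly a measurable phase difference responds to a small bearing change, a rough proxy for direction-finding sensitivity; the geometry values (d = r = λ/4, six peripheral antennas) are assumptions for the example.

```python
import numpy as np

def linear_sensitivity(theta, d_over_lambda=0.25):
    # ULA phase difference 2*pi*d*sin(theta)/lambda -> derivative ~ cos(theta),
    # which vanishes at endfire (theta = +/-90 degrees).
    return abs(2 * np.pi * d_over_lambda * np.cos(theta))

def circular_sensitivity(theta, n_ant=6, r_over_lambda=0.25):
    # Center-relative phase 2*pi*r*cos(theta - alpha_k)/lambda; use the most
    # responsive peripheral antenna for this bearing.
    alphas = np.linspace(0, 2 * np.pi, n_ant, endpoint=False)
    return np.max(np.abs(2 * np.pi * r_over_lambda * np.sin(theta - alphas)))

thetas = np.deg2rad(np.arange(0, 360, 5))
lin = np.array([linear_sensitivity(t) for t in thetas])
cir = np.array([circular_sensitivity(t) for t in thetas])
# The linear array has near-blind bearings; the circular array does not.
print(round(float(lin.min()), 3), round(float(cir.min()), 3))
```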
  • one wireless radio 360 may be shared among multiple antennas 350 that are multiplexed using antenna switch 370.
  • the BDFS may have a single wireless radio that can be used in conjunction with any of the antennas in the antenna array 350.
  • the BDFS includes a first wireless radio 360 that can be used in conjunction with a first subset of antennas of the antenna array 350, and a second wireless radio 360 that can be used in conjunction with a second subset of antennas of the antenna array 350.
  • different schemes may be used to determine the frequency and phase of an incoming signal to be able to determine the direction of the incoming signal.
  • the schemes include frequency compensation, frequency cancellation, and dual radios.
  • the frequency compensation method predicts the phase changes for the interval since the sampling at the previous antenna, and compensates the frequency offset of the signal when calculating the phase differences among different antennas.
  • Instantaneous phase information is calculated from the in-phase and quadrature components of a sample, and a series of phase changes (Δφ) is obtained. Knowing the sampling rate, a phase change rate may be estimated, with which an estimation of the signal frequency may be obtained when switching to a next antenna.
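The phase-change-rate step above can be sketched on synthetic I/Q samples. The sampling rate and tone frequency below are assumed values for the example.

```python
import numpy as np

FS = 1_000_000            # assumed I/Q sampling rate, Hz
F_TONE = 250_000          # assumed constant-tone frequency in the packet, Hz

t = np.arange(64) / FS
iq = np.exp(1j * (2 * np.pi * F_TONE * t + 0.7))    # I/Q samples, arbitrary phase

phase = np.unwrap(np.angle(iq))                     # instantaneous phase
dphi = np.diff(phase)                               # per-sample phase change
f_est = np.mean(dphi) * FS / (2 * np.pi)            # phase change rate -> Hz

print(round(float(f_est)))
```

With the frequency known, the phase at the moment the next antenna is sampled can be extrapolated and subtracted out, which is exactly the compensation the scheme above describes.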
  • the phase of the k-th sample at the i-th antenna can be represented as: φ_i(k) = 2πf(t_i + k·T_s) + θ + Δφ_i, where T_s is the sampling interval, θ is the initial phase of the signal, and Δφ_i is the phase change relative to the reference antenna due to the array geometry.
  • letting φ_k(n) represent the phase and omitting the sampling interval T_s:
  • φ_k(n) = 2πf(t_k + n) + θ + Δφ_k (6)
  • the frequency of the signal may be estimated using the above equations.
  • the phase may be represented as:
  • φ_m(n) = 2πf(t_m + n) + θ + Δφ_m (7)
  • the phase difference Δφ_k − Δφ_m can be estimated between the k-th and the m-th antenna.
  • the frequency cancelation method alternates the antenna switching pattern (ASP) .
  • the reference antenna is sampled between the sampling of consecutive antennas. That is, the reference antenna is sampled between the sampling of a k-th antenna and the sampling of a (k+1) -th antenna.
  • a stride-3 ASP (S3-ASP) switching pattern, where the reference antenna is sampled every 3 samples, looks as follows:
  • φ_0(n) = 2πf(t_0 + n) + θ + Δφ_0 (8)
  • φ_k(n + T) = 2πf(t_0 + T + n) + θ + Δφ_k
  • φ_k(n + 2T) = 2πf(t_0 + 2T + n) + θ + Δφ_k
  • φ_0(n + 3T) = 2πf(t_0 + 3T + n) + θ + Δφ_0
  • a stride-2 ASP (S2-ASP), in which the reference antenna is sampled every 2 samples (antenna 0, antenna k, antenna 0, antenna k+1, and so on), follows the same structure.
  • the S2-ASP reduces the overall time to finish one round of measurements across all antennas compared to the S3-ASP.
  • FIG. 6 illustrates a flow diagram of a method for determining the direction of a target object using an antenna switching pattern, according to one embodiment.
  • the reference antenna 410 is selected 610 using antenna switch 370 and the reference antenna is sampled 615 using a wireless radio 360.
  • the first peripheral antenna 420 is then selected 620 by the antenna switch 370 and the first peripheral antenna 420 is sampled 625 using the wireless radio 360.
  • a set number of samples is obtained for the peripheral antenna 420. For instance, if a S3-ASP is used, the peripheral antenna is sampled twice.
  • the reference antenna 410 is then selected 630 and sampled 635.
  • the antenna switch 370 selects 650 a next peripheral antenna 420 and steps 625 to 640 are repeated. Based on the obtained sampled signals, a direction finding algorithm is performed 660 to determine the direction of the target object. In some embodiments, after the direction finding algorithm is performed, the process goes back to step 620 and steps 620 to 660 are repeated.
  • FIG. 7 shows a circular antenna array with two wireless radios, according to one embodiment.
  • the reference antenna 410 (antenna 0) is coupled to a first radio front end (radio front end A) 715A
  • the peripheral antennas 420 are coupled to antenna switch 710, which in turn, is coupled to a second radio front end (radio front end B) 715B.
  • each of the peripheral antennas 420 is connected to the antenna switch 710 using antenna cables with equal or substantially equal lengths.
  • the radio front end A 715A is coupled to a first BLE chip (BLE chip A) 720A, which is coupled to a first ADC (ADC A) 725A.
  • the radio front end B 715B is coupled to a second BLE chip (BLE chip B) 720B, which is coupled to a second ADC (ADC B) 725B.
  • the antenna switch 710, radio front ends 715A and 715B, BLE chips 720A and 720B, and ADC 725A and 725B are controlled by the CCL 730.
  • the DF algorithm 735 determines the direction of a target object.
  • the output of the DF algorithm 735 may be filtered using a direction filter 740 and the output of the direction filter 740 is provided to the CSSS host interface for controlling the orientation of a video capturing device.
  • the reference antenna 410 is continuously sampled, whereas the peripheral antennas 420 are sampled according to a linear antenna switch pattern.
  • FIG. 8 illustrates a flow diagram of a method for determining the direction of a target object using two wireless radios, according to one embodiment.
  • a signal is sampled 810 from the reference antenna 410 using a first wireless radio (i.e., radio front end A 715A) .
  • a first peripheral antenna 420 is selected 820 by antenna switch 710 and a signal is sampled 825 from the selected antenna using a second wireless radio (i.e., radio front end B 715B) . If there are more peripheral antennas to sample, as determined in step 830, the antenna switch 710 then selects 840 a next peripheral antenna 420 of the antenna array. After the next peripheral antenna 420 is selected, steps 810 to 830 are repeated using the selected peripheral antenna.
  • that is, the reference antenna is sampled 810 using the first wireless radio and the selected next antenna is sampled using the second wireless radio. Otherwise, if all the peripheral antennas have been sampled, a direction finding algorithm is performed using the signals sampled by the first and second wireless radios.
  • FIG. 9 illustrates a diagram for reducing a multipath effect of signals on a circular antenna array, according to one embodiment.
  • neighboring antennas may affect the performance of the direction finding algorithm. For instance, in the circular antenna array of FIG. 9, antennas located in the front based on the direction of the incoming signal may receive more energy from the direct path of the target device, whereas antennas located in the back may receive a weaker signal and will likely include energy from a reflected path.
  • each group is based on three antennas that are located along a diameter of the circle formed by the peripheral antennas. That is, each group is based on two peripheral antennas located opposite to each other and the reference antenna.
  • a first group (group A) is formed based on antennas 0, 1, and 4; a second group is formed based on antennas 0, 2, and 5; and a third group (group C) is formed based on antennas 0, 3, and 6.
  • each set of three antennas that a group is based on splits the remaining peripheral antennas into two sets.
  • each of the groups includes the set of three antennas the group is based on and one of the two sets of remaining peripheral antennas.
  • the first group (group A) includes antennas 0, 1, 2, 3, and 4; the second group includes antennas 0, 2, 3, 4, and 5; and the third group (group C) includes antennas 0, 3, 4, 5, and 6.
  • the first group includes antennas 0, 1, 4, 5, and 6; the second group includes antennas 0, 1, 2, 5, and 6; and the third group includes antennas 0, 1, 2, 3, and 6.
  • Each group of antennas has a range of directions for which the direction finding algorithm is more accurate than the other groups.
  • group A is more accurate at determining the direction of a target object located at direction 910 compared to other groups.
  • the direction of the target object may be determined using each antenna group and a direction is selected based on the accuracy of each of the groups for identifying a target object located at each of the determined directions. That is, a first direction for the target object is determined using the first antenna group, a second direction is determined using the second antenna group, and a third direction is determined using the third antenna group. Then an accuracy for the first group to determine a target object in the first direction is determined.
  • an accuracy for the second group and the third group to determine a target object located in the second and third direction, respectively, is determined. Based on the accuracy determined for each of the antenna groups, a single direction for the target object is selected. That is, if the accuracy of the first group for identifying an object located in the first direction is greater than the accuracy of the second group for identifying an object located in the second direction and the accuracy of the third group for identifying an object located in the third direction, then the first direction is selected as the direction of the target object.
  • a single direction is identified for the target object using every antenna of the antenna array, and a determination is made whether the signal characteristics of the sampled signals agree with the expected signal characteristics to be sampled at each antenna based on the determined direction of the target object. That is, if the identified direction of the target object is direction 910, a determination is made whether the signals sampled for antennas 2 and 3 have a higher energy than the other antennas. Similarly, if the identified direction of the target object is direction 920, a determination is made whether the signal sampled for antennas 4 and 5 have higher energy than the other antennas. If a determination is made that the characteristics of the sampled signals do not agree with the expected characteristics of those signals, the identified direction is considered to be inaccurate. As such, the identified direction may be filtered out.
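The consistency check above can be sketched as a simple energy comparison between the antennas facing the candidate bearing and those facing away. The half-circle split and the mean-energy criterion are illustrative assumptions, not the patented test.

```python
import numpy as np

def direction_is_plausible(bearing, antenna_angles, energies):
    """True if the antennas facing the candidate bearing measured more energy
    than the antennas facing away; otherwise the bearing is likely multipath."""
    facing = np.cos(antenna_angles - bearing) > 0      # front half of the circle
    return bool(np.mean(energies[facing]) > np.mean(energies[~facing]))

angles = np.linspace(0, 2 * np.pi, 6, endpoint=False)
energy = np.array([1.0, 0.9, 0.3, 0.2, 0.3, 0.9])      # strongest near 0 rad

print(direction_is_plausible(0.0, angles, energy))     # consistent bearing
print(direction_is_plausible(np.pi, angles, energy))   # contradicted by energies
```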
  • the camera steering control logic uses a “dead zone” when deciding whether to rotate the video capturing device 105, as shown in FIG. 10.
  • the dead zone spans a predetermined number of degrees to the left and right side of the current camera orientation. As long as the target object 160 stays within the dead zone, the video capturing device 105 is not rotated by the rotary system 110.
  • the dead zone is a quarter of the field of view (FoV) of the video capturing device 105.
  • the camera steering control logic further uses an “overshooting zone” that spans a predetermined number of degrees to the left and right side of the FoV of the video capturing device 105 after being rotated by the rotary system 110.
  • the camera steering control logic identifies an orientation of the video capturing device 105 so that the target object 160 is within the overshooting zone.
  • the camera steering control logic identifies the orientation to rotate the video capturing device to by identifying a zone surrounding the target object 160 having a size equal to the overshooting zone.
  • a video capturing device orientation that is within the identified zone is then determined as the orientation the video capturing device 105 is to be rotated to.
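The dead-zone rule above reduces to a few lines of control logic. The FoV is an assumed value; the dead zone is a quarter of the FoV as stated (interpreted here as a half-width, which is an assumption), and the overshoot margin is an assumed tuning value.

```python
FOV = 60.0                       # assumed camera field of view, degrees
DEAD_ZONE = FOV / 4              # quarter-FoV dead zone, used here as a half-width
OVERSHOOT = 5.0                  # assumed lead past the target, degrees

def next_orientation(camera_deg, target_deg):
    """Return the new camera orientation, or the current one if no move is needed."""
    error = target_deg - camera_deg
    if abs(error) <= DEAD_ZONE:
        return camera_deg                        # target still inside the dead zone
    # Rotate toward the target, overshooting slightly in its direction of travel.
    return target_deg + (OVERSHOOT if error > 0 else -OVERSHOOT)

print(next_orientation(0.0, 10.0))    # inside the dead zone: no rotation
print(next_orientation(0.0, 25.0))    # outside: rotate past the target
```

Keeping the camera still inside the dead zone and overshooting slightly outside it is what reduces jittery back-and-forth panning in the captured video.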
  • the camera steering control logic further controls the speed of the rotation of the video capturing device 105 based on the speed of the target object 160.
  • the rotational speed of the video capturing device 105 is linearly proportional to the speed of the target object 160.
  • the rotational speed of the video capturing device 105 is an exponential function of the speed of the target object 160.
  • the rotary system 110 may track the movement of more than one target object. For instance, the rotary system 110 may track the movement of multiple players in a game of soccer. In another example, the rotary system 110 may track the movement of one or more players and the ball that is currently in play. In this scenario, the different target objects being tracked may be prioritized, as not every target object can be within the FoV of the video capturing device 105 at all times. For instance, the rotary system 110 may prioritize the tracking of a ball or a specific player in a game. If multiple target objects are prioritized, the rotary system 110 may select a direction that increases the number of tracked objects that are within the FoV of the video capturing device 105.
  • the rotary system may further select a direction that reduces the amount of rotation of the video capturing device.
  • the rotary system 110 determines a score based on the number of target objects 160 that would be within the FoV of the video capturing device 105 and the amount of rotation needed to position the video capturing device 105 in that direction. The rotary system 110 then selects the orientation with the largest score as the orientation to rotate the video capturing device 105 to.
  • the direction of the target object 160 may be determined using a single rotary system 110. If multiple rotary systems 110 are used, the position of the target object 160 may be determined.
  • FIG. 11 shows a diagram of two rotary systems 110A and 110B tracking an object 160 moving from position 1110A to position 1110B, according to one embodiment. Using rotary system 110A, angles α1 and α2 can be found. Similarly, using rotary system 110B, angles β1 and β2 can be found. If the positions of the rotary systems 110A and 110B are known, the positions 1110A and 1110B can be determined.
  • the distance d between the first position 1110A and the second position 1110B can be found as:
  • the distance d between the first position 1110A and the second position 1110B may be determined using an inertial measurement unit.
  • the position of the target object may be found without knowledge of the relative position of the rotary systems 110.
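The two-station triangulation of FIG. 11 can be illustrated with a short sketch. This is a minimal illustration, not the patent's implementation: it assumes one rotary system at the origin and the other a known baseline away along the x-axis, with each bearing measured from that baseline; all names are illustrative.

```python
import math

def triangulate(baseline, alpha, beta):
    """Locate a target from two bearings (radians).

    Rotary system A sits at the origin, system B at (baseline, 0).
    alpha is the bearing at A measured from the A->B axis; beta is
    the bearing at B measured from the B->A axis.
    """
    gamma = math.pi - alpha - beta        # third angle of the triangle
    range_a = baseline * math.sin(beta) / math.sin(gamma)  # law of sines
    return (range_a * math.cos(alpha), range_a * math.sin(alpha))

# Both bearings at 60 degrees on a unit baseline: the target sits at the
# apex of an equilateral triangle, above the baseline midpoint.
x, y = triangulate(1.0, math.radians(60.0), math.radians(60.0))
```

Calling the function once per tracked position (with the two angle pairs FIG. 11 provides) yields the two positions, from which the traveled distance d follows directly.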
  • Modules may constitute either software modules (e.g., code embodied on a machine-readable medium or in a transmission signal) or hardware modules.
  • a hardware module is a tangible unit capable of performing certain operations and may be configured or arranged in a certain manner.
  • one or more computer systems (e.g., a standalone, client, or server computer system)
  • one or more hardware modules of a computer system (e.g., a processor or a group of processors)
  • software (e.g., an application or application portion)
  • a hardware module may be implemented mechanically or electronically.
  • a hardware module may comprise dedicated circuitry or logic that is permanently configured (e.g., as a special-purpose processor, such as a field programmable gate array (FPGA) or an application-specific integrated circuit (ASIC)) to perform certain operations.
  • a hardware module may also comprise programmable logic or circuitry (e.g., as encompassed within a general-purpose processor or other programmable processor) that is temporarily configured by software to perform certain operations. It will be appreciated that the decision to implement a hardware module mechanically, in dedicated and permanently configured circuitry, or in temporarily configured circuitry (e.g., configured by software) may be driven by cost and time considerations.
  • processors may be temporarily configured (e.g., by software) or permanently configured to perform the relevant operations. Whether temporarily or permanently configured, such processors may constitute processor-implemented modules that operate to perform one or more operations or functions.
  • the modules referred to herein may, in some example embodiments, comprise processor-implemented modules.
  • the one or more processors may also operate to support performance of the relevant operations in a “cloud computing” environment or as a “software as a service” (SaaS) .
  • at least some of the operations may be performed by a group of computers (as examples of machines including processors), these operations being accessible via a network (e.g., the Internet) and via one or more appropriate interfaces (e.g., application program interfaces (APIs)).
  • the performance of certain of the operations may be distributed among the one or more processors, not only residing within a single machine, but deployed across a number of machines.
  • the one or more processors or processor-implemented modules may be located in a single geographic location (e.g., within a home environment, an office environment, or a server farm). In other example embodiments, the one or more processors or processor-implemented modules may be distributed across a number of geographic locations.
  • any reference to “one embodiment” or “an embodiment” means that a particular element, feature, structure, or characteristic described in connection with the embodiment is included in at least one embodiment.
  • the appearances of the phrase “in one embodiment” in various places in the specification are not necessarily all referring to the same embodiment.
  • the terms “coupled” and “connected, ” along with their derivatives, may be used to describe some embodiments.
  • some embodiments may be described using the term “coupled” to indicate that two or more elements are in direct physical or electrical contact.
  • the term “coupled” may also mean that two or more elements are not in direct contact with each other, but still co-operate or interact with each other.
  • the embodiments are not limited in this context.
  • the terms “comprises, ” “comprising, ” “includes, ” “including, ” “has, ” “having” or any other variation thereof, are intended to cover a non-exclusive inclusion.
  • a process, method, article, or apparatus that comprises a list of elements is not necessarily limited to only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus.
  • “or” refers to an inclusive or and not to an exclusive or.
  • condition A or B is satisfied by any one of the following: A is true (or present) and B is false (or not present) , A is false (or not present) and B is true (or present) , and both A and B are true (or present) .

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Variable-Direction Aerials And Aerial Arrays (AREA)

Abstract

An object tracking device includes a direction finding subsystem and a steering subsystem. The direction finding subsystem includes multiple peripheral antennas, a reference antenna, an antenna switch, a first wireless radio for receiving a wireless signal from a target object via a peripheral antenna selected by the antenna switch, a second wireless radio for receiving the wireless signal from the target object via the reference antenna, and a processing unit. The processing unit controls the antenna switch and determines the direction of the target object based on signals sampled by the first wireless radio and the second wireless radio. The steering subsystem includes a holder, one or more motors for rotating the holder, and a control unit for receiving the direction of the target object determined by the direction finding subsystem and controlling the motors based on the received direction of the target object.

Description

DIRECTION FINDING OF WIRELESS COMMUNICATION DEVICES
BACKGROUND
1. Field of Art
This disclosure relates generally to digital content processing and particularly to capturing video highlights in sports videos by tracking the position of a target object and synchronizing a video capturing device (e.g., a mobile phone) on a rotary station with the movement of the target object.
2. Description of the Related Art
With the advancement of mobile computing devices such as smart phones and tablets, more and more people are able to record videos of various types of events, such as recording sports videos and sharing video highlights in the recorded sports videos with others. A video highlight of a sports video is a portion of the sports video and represents a semantically important event captured in the sports video (e.g., a short video clip capturing goals or goal attempts in a soccer game video) .
Steadily recording a long sports event using a smart phone or tablet can be challenging. One challenge comes from the difficulty of capturing rapid movements over a large sports field. For soccer, basketball, baseball, hockey, and other similar sports, the sports field is typically very large and players spread out across it. Thus, viewers of the sports event have to frequently turn their heads to track the players’ activities during the game. Another challenge is predicting when a video highlight is about to happen in a live event. Thus, in order to capture an unpredictable video highlight in a sports game or live event, a viewer has to stay alert for the whole event, which can be tiring, frustrating when highlights are missed, and degrading to the joy of watching the game.
SUMMARY
Embodiments of the disclosure provide an object tracking device for tracking the movement of a target object. The object tracking device includes a direction finding subsystem and a steering subsystem. The direction finding subsystem determines the direction of the target object. The direction finding subsystem includes multiple peripheral antennas, a reference antenna, an antenna switch coupled to the peripheral antennas, a first wireless radio, a second wireless radio, and a processing unit. The antenna switch selects one of the plurality of peripheral antennas. The first wireless radio receives a wireless signal from the target object via a peripheral antenna selected by the antenna switch. The second wireless radio receives the wireless signal from the target object via the reference antenna. The processing unit controls the antenna switch and determines the direction of the target object based on signals sampled by the first wireless radio and the second wireless radio. The steering subsystem includes a holder, one or more motors, and a control unit. The holder is configured to be attached to a video capturing device (e.g., a video camera) . The one or more motors are configured to rotate the holder. The control unit receives the direction of the target object determined by the direction finding subsystem and controls the one or more motors based on the received direction of the target object.
Embodiments also include a method for determining the direction of a target object. A first wireless signal is sampled using a first peripheral antenna of a plurality of peripheral antennas, and a reference antenna. A phase difference between the first wireless signal sampled by the first peripheral antenna and the first wireless signal sampled by the reference antenna is determined. A second wireless signal is sampled using a second peripheral antenna of the plurality of peripheral antennas, and the reference antenna. A phase difference between the second wireless signal sampled by the second peripheral antenna and the second wireless signal sampled by the reference antenna is determined. The direction of the target object is determined based on (1) the phase difference between the first wireless signal sampled by the first peripheral antenna and the first wireless signal sampled by the reference antenna, and (2) the phase difference between the second wireless signal sampled by the second peripheral antenna and the second wireless signal sampled by the reference antenna. More such phase differences can be similarly obtained using wireless signal samples from additional peripheral antennas against the same reference antenna, and incorporated into the determination of the direction of the target object.
The features and advantages described in the specification are not all inclusive and, in particular, many additional features and advantages will be apparent to one of ordinary skill in the art in view of the drawings, specification, and claims. Moreover, it should be noted that the language used in the specification has been principally selected for readability and instructional purposes, and may not have been selected to delineate or circumscribe the disclosed subject matter.
BRIEF DESCRIPTION OF THE DRAWINGS
FIG. 1 is a block diagram of an environment for tracking a wireless communication device and synchronizing a video capturing device (e.g. a mobile phone) connected to a rotary station with the movement of the wireless communication device, according to one embodiment.
FIG. 2A illustrates a block diagram of the rotary system, according to one embodiment.
FIG. 2B illustrates a block diagram of the rotary system, according to another embodiment.
FIG. 3A illustrates a block diagram of the camera steering & servo subsystem, according to one embodiment.
FIG. 3B illustrates a block diagram of the BLE direction finding subsystem, according to one embodiment.
FIG. 4A illustrates a top and side view of an antenna array arranged in a straight line, according to one embodiment.
FIG. 4B illustrates a top and side view of an antenna array arranged in a circular pattern, according to one embodiment.
FIG. 5A illustrates a uniform linear antenna array, according to one embodiment.
FIG. 5B illustrates a circular antenna array, according to one embodiment.
FIG. 6 illustrates a flow diagram of a method for determining the direction of a target object using an antenna switching pattern, according to one embodiment.
FIG. 7 shows a circular antenna array with two wireless radios, according to one embodiment.
FIG. 8 illustrates a flow diagram of a method for determining the direction of a target object using two wireless radios, according to one embodiment.
FIG. 9 illustrates a diagram for reducing a multipath effect of signals on a circular antenna array, according to one embodiment.
FIG. 10 illustrates a diagram of the field of view (FoV) for deciding whether to rotate the video capturing device, according to one embodiment.
FIG. 11 illustrates a diagram of two rotary systems tracking an object moving from a first position to a second position, according to one embodiment.
The figures depict various embodiments of the present disclosure for purposes of illustration only. One skilled in the art will readily recognize from the following discussion  that alternative embodiments of the structures and methods illustrated herein may be employed without departing from the principles of the disclosure described herein.
DETAILED DESCRIPTION
Environment Overview
A solution is provided to conveniently capture video highlights in a sports event by tracking movement of a target object and synchronizing rotations of a video capturing device (e.g., a mobile phone) on a rotary station with the movement of the target object. The target object includes a wireless module that transmits a wireless signal that can be used to determine the position of the target object. Position data describing the position of the target object, or motion data describing the movement of the target object is transmitted to the rotary station, and processed to generate commands to drive the video capturing device (e.g., a mobile phone) mounted on the rotary station, to follow the movement of the target object.
In this disclosure, “digital content” generally refers to any machine-readable and machine-storable work. Digital content can include, for example, a combination of video and audio. Alternatively, digital content can be a still image. For purposes of simplicity and the description of one embodiment, the digital content will be referred to as a “video, ” but no limitation on the type of digital content that can be analyzed is intended by this terminology.
FIG. 1 is a block diagram of an environment 100 for tracking a wireless communication device and synchronizing a video capturing device (e.g., a mobile phone) connected to a rotary station with the movement of the wireless communication device, according to one embodiment. The embodiment illustrated in FIG. 1 includes a target 160 that transmits a wireless signal using one or more wireless communication protocols (e.g., Bluetooth or Wi-Fi), and a video capturing device 105 mounted on a rotary system 110. In some embodiments, the rotary system 110 is further mounted on a tripod 150. If the video capturing device 105 is recording a sports game, the target 160 may be inside a sports field 120.
The target 160 includes a wireless module that transmits a wireless signal using a wireless communication protocol. In some embodiments, the target 160 transmits a data packet using the wireless communication protocol such that the encoded wireless signal has a specific characteristic (e.g., a sinusoidal signal with a single frequency component) . For instance, if the Bluetooth protocol is used, a data packet may be generated so that the encoded wireless signal is a sequence of zeros or a sequence of ones. Even though the Bluetooth protocol is used throughout, the present disclosure is not limited thereto. For instance, other wireless protocols such as Wi-Fi, radio-frequency identification (RFID) or Zigbee may be used instead.
The video capturing device 105 is an electronic device used to capture digital content, such as recording a video clip of a sports event happening on the sports field 120. In one embodiment, the video capturing device 105 is a mobile phone and includes an image capturing device and a transmitter for transmitting the captured digital content. The video capturing device 105 can be, for example, a smart phone, a tablet, a digital single-lens reflex (DSLR) camera, or any other suitable user device or computing device for capturing digital content. For simplicity, the various electronic devices that are capable of capturing digital content are generally referred to as the video capturing device 105. The video capturing device 105 can be remotely triggered by a remote control to capture digital content. In some configurations, the video capturing device 105 captures digital content for a predetermined period of time (e.g., 30 seconds). In other configurations, the video capturing device 105 begins capturing digital content when remotely triggered by the remote control and ends capturing digital content when again remotely triggered by the remote control. In addition, the remote control may have certain function buttons to tag the content being captured, and certain sensors to incorporate rich content tags such as a voice tag. The video capturing device 105 can transmit the captured digital content to a cloud storage service. The video capturing device 105 is mounted on or otherwise connected to the rotary system 110.
A user may use a mobile device as a video capturing device to record video clips of a sports event and to consume digital content, such as the recorded video clips or highlights thereof. For example, the user uses the mobile device to perform functions such as, consuming video clip (s) captured by the video capturing device 105, and video highlights generated from the captured video clip (s) . The mobile device of the user can be a smart phone, a tablet, or any other suitable user device or computing device for consuming video clip (s) . In addition, the mobile device of the user provides a user interface, such as physical and/or on-screen buttons, with which the user may interact with the mobile device to perform functions such as tagging, viewing, sharing, and/or otherwise consuming the video clip (s) .
The rotary system 110 is configured to drive the video capturing device 105 (e.g., a mobile phone of the user having a digital camera) mounted on the rotary system 110 to follow the movement of the target object 160. The rotary system 110 is further described with respect to FIG. 2A and 2B.
FIG. 2A illustrates a block diagram of the rotary system 110, according to one embodiment. The rotary system 110 includes a camera holder 210, a camera steering & servo subsystem (CSSS) 220, and a Bluetooth Low Energy (BLE) direction finding subsystem (BDFS) 230. Although BLE radios are used herein for ease of explanation, other radios, such as Wi-Fi or RFID radios, may be used instead. The camera holder 210 provides support for attaching a video capturing device 105, such as a video camera recorder, to the rotary system 110.
The camera steering & servo subsystem (CSSS) 220 receives an indication of the direction of the target 160 and moves the camera holder 210 to direct the video capturing device toward the identified direction of the target 160. FIG. 3A shows a block diagram of the CSSS 220, according to one embodiment. The CSSS 220 includes one or more gear sets 310, one or more motors 320, a rechargeable battery 330, and control circuitry 340. The one or more motors 320 rotate the camera holder 210 with one or more degrees of freedom. In some embodiments, the CSSS 220 includes two motors that move the camera holder 210 with two degrees of freedom (e.g., panning and tilting). The control circuitry 340 implements the camera steering logic that allows the camera to follow the target. For instance, the control circuitry 340 may keep the target within the field of view (FoV) of the camera. In some embodiments, the CSSS 220 further includes a gear set 310 and a code wheel to provide higher-precision control of the rotation of the camera holder 210. In some embodiments, the CSSS 220 further includes a wireless module to enable communication with other devices (e.g., a remote control, a phone, or the video capturing device 105).
In some embodiments, the motors 320 are voltage controlled. That is, the rotational velocity of a motor 320 is directly proportional to the voltage applied to it. Furthermore, the codewheel provides one-degree precision, while the gear set 310 provides a 1:36 reduction. That is, the codewheel combined with the gear set 310 provides a 1/36-degree steering step size.
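The step-size arithmetic above is straightforward; a rough sketch follows. The codewheel precision and gear ratio are from the embodiment, while the motor constant K_V is a made-up placeholder, since the patent does not give one.

```python
# Codewheel resolution at the motor shaft, and the gear reduction that
# divides it at the camera holder (figures from the embodiment above).
CODEWHEEL_DEG = 1.0
GEAR_RATIO = 36
step_deg = CODEWHEEL_DEG / GEAR_RATIO          # 1/36-degree steering step

# Voltage-controlled motor: rotational speed is proportional to voltage.
# K_V is a hypothetical motor constant (degrees per second per volt).
K_V = 10.0

def voltage_for_speed(deg_per_sec):
    return deg_per_sec / K_V
```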
The control circuitry 340 controls the steering of the CSSS 220. In some embodiments, the CSSS 220 is configured to filter the position information of the target object 160 to increase the smoothness of a video captured by the video capturing device 105, while keeping the target object near the center of the field of view of the video capturing device 105.
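One simple way such steering logic can keep the target near the center without jittering is the dead-zone rule described earlier: rotate only when the target leaves a zone around the current orientation. The sketch below assumes the quarter-FoV dead zone applies on each side of the camera orientation; this is an illustrative reading, not the patent's implementation.

```python
def should_rotate(camera_deg, target_deg, fov_deg):
    """Return True when the target has left the dead zone.

    The dead zone here spans a quarter of the FoV on each side of the
    current camera orientation (one reading of the embodiment above).
    """
    dead_zone = fov_deg / 4.0
    # Wrap the bearing error into [-180, 180) degrees.
    error = (target_deg - camera_deg + 180.0) % 360.0 - 180.0
    return abs(error) > dead_zone
```

With a 60-degree FoV, a target 5 degrees off center stays inside the dead zone and the camera holds still; a target 20 degrees off triggers a rotation.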
The BLE direction finding subsystem (BDFS) 230 finds the direction of the target 160 based on a wireless signal transmitted by a wireless transceiver of the target 160. In some embodiments, the wireless signal is a Bluetooth signal. In other embodiments, other wireless communication protocols may be used. FIG. 3B shows a block diagram of the BDFS 230, according to one embodiment. The BDFS 230 includes multiple antennas 350, one or two wireless radios 360 (e.g., two Bluetooth Low Energy (BLE) radios), an antenna switch 370, an analog-to-digital converter (ADC) 380, and a microcontroller 390. In some embodiments, the BDFS 230 receives power from the battery of the CSSS 220. In other embodiments, the BDFS 230 includes its own battery for powering its components.
Each of the wireless radios 360 is coupled to a subset of the antennas. In some embodiments, the BDFS 230 includes two wireless radios: one wireless radio is coupled to a first antenna (e.g., a reference antenna), and the other wireless radio is coupled to the remaining antennas. The antenna switch 370 controls which antenna a wireless radio is connected to. As such, a single wireless radio may receive or transmit different wireless signals using different antennas 350. The wireless radios 360 receive electrical signals picked up by the antennas 350 and demodulate them to extract the information they carry. The wireless radios 360 may also modulate information to be sent, producing a current that excites one or more antennas to emit radio waves.
The wireless radios 360 are used in conjunction with the antennas 350 to receive wireless signals from the target object 160 for determining the direction of the target object. In some embodiments, the wireless radios 360 and the antennas 350 are configured to receive Bluetooth signals. In some embodiments, one or more radios 360 are used in conjunction with one or more antennas 350 to communicate with other devices. For instance, a rotary system 110 may communicate with another rotary system 110 via Bluetooth using one wireless radio 360 coupled to one antenna 350. The rotary system 110 may also communicate wirelessly with the video capturing device 105, for instance to obtain properties of the video capturing device 105, such as field of view information.
The analog-to-digital converter (ADC) 380 converts an analog signal received from a wireless radio 360 into a digital signal. In some embodiments, the ADC 380 is a discrete component. In other embodiments, the ADC is integrated inside the microcontroller 390.
The microcontroller 390 implements the core control logic (CCL), the direction finding (DF) algorithm, and the post filtering of the resulting directions. The CCL performs admission control of BLE devices by obtaining BLE device ID information from the BLE chip and processing incoming data only from BLE chips included in a whitelist. The CCL further performs A/D control by indicating to the ADC 380 when to sample signals. The CCL further performs antenna switch control by directing the antenna switch 370 to switch among antennas at a certain time and according to a certain antenna switch pattern (ASP).
The DF algorithm performs multiple signal classification and determines the parameters of multiple wavefronts arriving at the antennas 350 from the signal measurements made at each of the antennas. The DF algorithm may determine the number of incoming signals, the strength of signals and noise, the cross correlation among directional waveforms, direction of arrival of the incoming signals, and the frequency components of the incoming signals.
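The multiple signal classification (MUSIC) step can be sketched for a uniform linear array as follows. This is a generic textbook MUSIC implementation, not the patent's code; the simulated source direction, array size, and noise level are all illustrative.

```python
import numpy as np

def music_spectrum(samples, d_over_lambda, angles_deg, n_sources=1):
    """MUSIC pseudo-spectrum for a uniform linear array.

    samples: (n_antennas, n_snapshots) complex matrix of received signals.
    """
    n_ant = samples.shape[0]
    R = samples @ samples.conj().T / samples.shape[1]   # sample covariance
    _, vecs = np.linalg.eigh(R)                         # eigenvalues ascending
    En = vecs[:, : n_ant - n_sources]                   # noise subspace
    k = np.arange(n_ant)
    spectrum = []
    for ang in np.deg2rad(angles_deg):
        a = np.exp(1j * 2 * np.pi * d_over_lambda * k * np.sin(ang))
        spectrum.append(1.0 / np.abs(a.conj() @ En @ En.conj().T @ a))
    return np.array(spectrum)

# Simulate one source at 30 degrees on a 5-antenna, half-wavelength array.
rng = np.random.default_rng(0)
k = np.arange(5)
a_true = np.exp(1j * np.pi * k * np.sin(np.deg2rad(30.0)))
s = np.exp(1j * 2 * np.pi * rng.random(200))            # unit-power source
noise = 0.01 * (rng.standard_normal((5, 200)) + 1j * rng.standard_normal((5, 200)))
X = np.outer(a_true, s) + noise
angles = np.arange(-90, 91)
est_deg = int(angles[np.argmax(music_spectrum(X, 0.5, angles))])
```

Steering vectors orthogonal to the noise subspace produce large pseudo-spectrum values, so the peak of the scan indicates the direction of arrival.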
The post filtering removes outliers and smooths out jitter from the obtained set of directions. The obtained set of directions may be noisy due to system noise and multipath effects of the incoming signal. For example, the post filtering may include removing directions that are impossible or unlikely in practice. For instance, directions that correspond to underground locations may be filtered out when tracking a basketball, since it is unlikely the basketball will be underground during actual gameplay. Examples of post filtering include the Kalman filter.
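A lightweight stand-in for this post-filtering stage might combine outlier rejection with exponential smoothing; the thresholds below are illustrative, and the Kalman filter mentioned above would be a more principled choice.

```python
def post_filter(directions_deg, max_jump=20.0, alpha=0.3):
    """Reject outlier bearings and smooth the rest.

    A sample is dropped when it jumps more than max_jump degrees from
    the last accepted estimate (e.g., a multipath glitch); surviving
    samples are blended with an exponential moving average.
    """
    out, est = [], None
    for d in directions_deg:
        if est is not None and abs(d - est) > max_jump:
            out.append(est)          # hold the previous estimate
            continue
        est = d if est is None else (1 - alpha) * est + alpha * d
        out.append(est)
    return out

track = post_filter([10, 11, 12, 95, 13, 14])  # 95 is a multipath glitch
```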
Antenna Array Configurations
FIG. 4A illustrates a top and side view of an antenna array arranged in a straight line, according to one embodiment. The antenna array of FIG. 4A includes five individual antennas arranged in a straight line. In other embodiments, the antenna array may include a different number of antennas. In some embodiments, the antenna array includes a reference antenna 410 and multiple peripheral antennas 420. The reference antenna can be any antenna of the antenna array. In the embodiment of FIG. 4A, the reference antenna is at the center of the antenna array. In some embodiments, the antennas are separated by a distance in the range of λ/4 to λ/2, where λ is the wavelength of the wireless signal being received by the antennas.
FIG. 4B illustrates a top and side view of an antenna array arranged in a circular pattern, according to one embodiment. The antenna array of FIG. 4B includes multiple peripheral antennas 420 arranged in a circular pattern, and a reference antenna 410 at the center of the peripheral antennas 420. In some embodiments, the antenna array includes an even number of evenly spaced peripheral antennas 420. As such, the reference antenna 410 lies between pairs of opposing peripheral antennas 420.
FIG. 5A illustrates the direction finding of an incoming signal in a uniform linear antenna array, according to one embodiment. The antenna array includes N+1 antennas, each spaced with a distance d. When the distance from the subject to the antenna array is very large compared to the antenna spacing, the incident angle θ is substantially equal at all antennas. As such, the k-th antenna receives the wireless signal a time
τk = k·d·sinθ/c
earlier than antenna 0, where c is the radio propagation speed. If the signal received at antenna 0 is x0 (t) = s (t) , then the signal at antenna k is:
xk (t) = s (t + τk) = s (t) ·e^ (j2πf·k·d·sinθ/c)                (1)
Thus, for all N+1 antennas, the received signals can form a vector:
x (t) = [x0 (t) , x1 (t) , …, xN (t) ] ^T = a (θ) ·s (t) , with a (θ) = [1, e^ (j2πd·sinθ/λ) , …, e^ (j2πN·d·sinθ/λ) ] ^T                (2)
where a (θ) is a “steering vector” and λ = c/f is the wavelength of the signal.
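For a single pair of antennas, the steering-vector relation can be inverted directly: the phase lead between adjacent elements is 2πd·sinθ/λ, so θ follows from an arcsine. A minimal sketch with illustrative values (unit wavelength):

```python
import math

def aoa_from_phase(delta_phi, d, lam):
    """Incidence angle (radians) from the phase lead delta_phi between
    two antennas spaced d apart, for a signal of wavelength lam.

    Inverts delta_phi = 2*pi*d*sin(theta)/lam. Unambiguous only for
    d <= lam/2, one reason the spacing above stays within [lam/4, lam/2].
    """
    return math.asin(delta_phi * lam / (2 * math.pi * d))

# Round trip: a wavefront at 25 degrees on a half-wavelength pair.
theta = math.radians(25.0)
delta_phi = 2 * math.pi * 0.5 * math.sin(theta)   # d = lam/2 with lam = 1
theta_hat_deg = math.degrees(aoa_from_phase(delta_phi, 0.5, 1.0))
```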
FIG. 5B illustrates the direction finding of an incoming signal in a circular antenna array, according to one embodiment. In this embodiment, assuming antenna 0 (reference antenna 410) is at the center of the array and antenna 1 lies along the orientation baseline, the effective incident angle for the k-th peripheral antenna is
θk = θ − 2π (k − 1) /N                (3)
As such, taking the center reference antenna as the phase reference, the steering vector is:
a (θ) = [1, e^ (j2πR·cos (θ1) /λ) , …, e^ (j2πR·cos (θN) /λ) ] ^T                (4)
where R is the radius of the circular array and N is the number of peripheral antennas.
Compared with the linear array, the circular antenna array exhibits more uniform sensitivity across different incoming directions.
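The circular steering vector can be computed as follows. The layout (center reference antenna, N evenly spaced peripheral antennas) follows FIG. 4B, while the radius value and function names are illustrative assumptions.

```python
import numpy as np

def circular_steering(theta, n_periph, radius, lam):
    """Steering vector for a circular array with a center reference.

    Element 0 is the center (reference) antenna, whose phase is taken
    as 0; peripheral antenna k (k = 1..n_periph) sits at angle
    2*pi*(k-1)/n_periph on a circle of the given radius.
    """
    k = np.arange(1, n_periph + 1)
    theta_k = theta - 2 * np.pi * (k - 1) / n_periph   # effective incident angles
    phases = 2 * np.pi * radius * np.cos(theta_k) / lam
    return np.concatenate(([1.0 + 0j], np.exp(1j * phases)))

# 8 peripheral antennas, quarter-wavelength radius, signal along the baseline.
a = circular_steering(0.0, 8, 0.25, 1.0)
```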
Direction Finding Schemes
In order to reduce the complexity of the BDFS, one wireless radio 360 may be shared among multiple antennas 350 that are multiplexed using the antenna switch 370. For instance, instead of having one wireless radio 360 for each antenna 350, the BDFS may have a single wireless radio that can be used in conjunction with any of the antennas in the antenna array. In another example, the BDFS includes a first wireless radio 360 that can be used in conjunction with a first subset of antennas of the antenna array, and a second wireless radio 360 that can be used in conjunction with a second subset of antennas of the antenna array. When multiplexing multiple antennas with a single wireless radio, different schemes may be used to determine the frequency and phase of an incoming signal in order to determine the direction of the incoming signal. In particular, the schemes include frequency compensation, frequency cancellation, and dual radios.
Frequency Compensation Scheme
The frequency compensation method predicts the phase change over the interval since the sampling at the previous antenna, and compensates for the frequency offset of the signal when calculating the phase differences among different antennas. Instantaneous phase information is calculated from the in-phase and quadrature components of each sample, yielding a series of phase changes (Δφ). Knowing the sampling rate, a phase change rate may be estimated, from which an estimate of the signal frequency may be obtained when switching to the next antenna.
Let the source signal be s (t) = aej2πft+φ, then the k-th sample at the i-th antenna can be represented as:
Figure PCTCN2017083588-appb-000006
where tk is the measurement start time for antenna k, Ts is the sampling interval, and
Figure PCTCN2017083588-appb-000007
Figure PCTCN2017083588-appb-000008
is the phase change relative to the reference antenna due to the array geometry. Letting ψk(n) represent the phase and omitting the sampling interval Ts:
ψk (n) = 2πf (tk + n) + φ + φk                     (6)
If two samples are obtained from an antenna, the frequency of the signal may be estimated using the above equations. Similarly, for the m-th antenna, the phase may be represented as:
ψm (n) = 2πf (tm + n) + φ + φm                   (7)
From equations (6) and (7) and with the estimated frequency, the phase difference ψk -ψm can be estimated between the k-th and the m-th antenna.
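A hedged sketch of the compensation itself, using the estimated frequency to remove the phase accumulated between the two antennas' sampling instants (names are illustrative; compare equations (6) and (7)):

```python
import math

def compensated_phase_difference(psi_k, psi_m, t_k, t_m, f_est):
    """Geometry-induced phase difference phi_k - phi_m between antennas k and m,
    after removing the 2*pi*f*(t_k - t_m) offset accumulated between their
    sampling start times (subtracting equation (7) from equation (6))."""
    raw = psi_k - psi_m - 2.0 * math.pi * f_est * (t_k - t_m)
    return (raw + math.pi) % (2.0 * math.pi) - math.pi  # wrap the result to (-pi, pi]
```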
Frequency Cancellation Scheme
The frequency cancellation method modifies the antenna switching pattern (ASP). In the ASP for the frequency cancellation method, the reference antenna is sampled between the sampling of consecutive peripheral antennas. That is, the reference antenna is sampled between the sampling of a k-th antenna and the sampling of a (k+1)-th antenna. For example, for an antenna array with N+1 antennas (e.g., the antenna array configuration of FIG. 5B, having antennas 0 through N), a stride-3 ASP (S3-ASP), in which the reference antenna is sampled every third sample, looks as follows:
{ant0, ant1, ant1, ant0, ant2, ant2, ant0, ..., ant0, antk, antk, ant0, ..., ant0, antN, antN, ant0} where ant0 through antN are antennas 0 through N. Using equation (6), it can be obtained that:
ψ0 (n) = 2πf (t0 + n) + φ + φ0                (8)
ψk (n + T) = 2πf (t0 + T + n) + φ + φk
ψk (n + 2T) = 2πf (t0 + 2T + n) + φ + φk
ψ0 (n + 3T) = 2πf (t0 + 3T + n) + φ + φ0
Where T is the antenna switching interval. Then, from equation (8) , it can be obtained that:
Figure PCTCN2017083588-appb-000009
In another example, a stride-2 ASP (S2-ASP) looks as follows:
{ant0, ant1, ant0, ant2, ant0, ..., ant0, antk, ant0, ..., ant0, antN, ant0}
Using equation (6), it can be obtained that:
Figure PCTCN2017083588-appb-000010
The S2-ASP reduces the overall time to finish one round of measurements across all antennas compared to the S3-ASP.
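The two switching patterns can be generated mechanically; a minimal sketch (antenna indices follow the text, with 0 as the reference antenna):

```python
def stride3_asp(num_peripheral):
    """S3-ASP: the reference antenna (0) is revisited after every peripheral
    antenna, and each peripheral antenna is sampled twice."""
    sequence = []
    for k in range(1, num_peripheral + 1):
        sequence += [0, k, k]
    return sequence + [0]

def stride2_asp(num_peripheral):
    """S2-ASP: the reference antenna (0) is interleaved with single samples of
    each peripheral antenna, shortening a full measurement round."""
    sequence = []
    for k in range(1, num_peripheral + 1):
        sequence += [0, k]
    return sequence + [0]
```

For the same number of peripheral antennas, the S2-ASP sequence is shorter, matching the observation that it finishes one round of measurements faster.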
FIG. 6 illustrates a flow diagram of a method for determining the direction of a target object using an antenna switching pattern, according to one embodiment. The reference antenna 410 is selected 610 using antenna switch 370 and the reference antenna is sampled 615 using a wireless radio 360. The first peripheral antenna 420 is then selected 620 by the antenna switch 370 and the first peripheral antenna 420 is sampled 625 using the wireless radio 360. Depending on the ASP, a set number of samples is obtained for the peripheral antenna 420. For instance, if an S3-ASP is used, the peripheral antenna is sampled twice. The reference antenna 410 is then selected 630 and sampled 635. If there are more peripheral antennas to sample, as determined in step 640, the antenna switch 370 selects 650 a next peripheral antenna 420 and steps 625 to 640 are repeated. Based on the sampled signals, a direction finding algorithm is performed 660 to determine the direction of the target object. In some embodiments, after the direction finding algorithm is performed, the process goes back to step 620 and steps 620 to 660 are repeated.
Dual Radio Scheme
FIG. 7 shows a circular antenna array with two wireless radios, according to one embodiment. As shown in FIG. 7, the reference antenna 410 (antenna 0) is coupled to a first radio front end (radio front end A) 715A, and the peripheral antennas 420 are coupled to antenna switch 710, which, in turn, is coupled to a second radio front end (radio front end B) 715B. In some embodiments, each of the peripheral antennas 420 is connected to the antenna switch 710 using antenna cables with equal or substantially equal lengths. The radio front end A 715A is coupled to a first BLE chip (BLE chip A) 720A, which is coupled to a first ADC (ADC A) 725A. Similarly, the radio front end B 715B is coupled to a second BLE chip (BLE chip B) 720B, which is coupled to a second ADC (ADC B) 725B. The antenna switch 710, radio front ends 715A and 715B, BLE chips 720A and 720B, and ADCs 725A and 725B are controlled by the CCL 730. Based on the sampled signals, the DF algorithm 735 determines the direction of a target object. In some embodiments, the output of the DF algorithm 735 may be filtered using a direction filter 740 and the output of the direction filter 740 is provided to the CSSS host interface for controlling the orientation of a video capturing device. In this configuration, the reference antenna 410 is continuously sampled, whereas the peripheral antennas 420 are sampled according to a linear antenna switch pattern. Using the dual radio configuration of FIG. 7, it can be determined that:
ψk -ψ0 = ψk (n + T) -ψo (n + T)          (11)
FIG. 8 illustrates a flow diagram of a method for determining the direction of a target object using two wireless radios, according to one embodiment. A signal is sampled 810 from the reference antenna 410 using a first wireless radio (i.e., radio front end A 715A). A first peripheral antenna 420 is selected 820 by antenna switch 710 and a signal is sampled 825 from the selected antenna using a second wireless radio (i.e., radio front end B 715B). If there are more peripheral antennas to sample, as determined in step 830, the antenna switch 710 then selects 840 a next peripheral antenna 420 of the antenna array. After the next peripheral antenna 420 is selected, steps 810 to 830 are repeated using the selected peripheral antenna. That is, the reference antenna is sampled 810 using the first wireless radio and the selected next antenna is sampled using the second wireless radio. Otherwise, if all the peripheral antennas have been sampled, a direction finding algorithm is performed 850 using the signals sampled by the first and second wireless radios.
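The FIG. 8 loop can be sketched as follows (the two radios sample concurrently in hardware; the callables here are placeholders for the radio front ends and the antenna switch, not interfaces from the patent):

```python
def dual_radio_round(select_peripheral, sample_reference, sample_peripheral,
                     num_peripheral):
    """One measurement round: for each peripheral antenna, sample the reference
    antenna with the first radio and the selected peripheral with the second,
    returning (reference, peripheral) sample pairs for the DF algorithm."""
    pairs = []
    for k in range(1, num_peripheral + 1):
        select_peripheral(k)                  # steps 820/840: switch radio B's antenna
        pairs.append((sample_reference(),     # step 810: radio A, reference antenna
                      sample_peripheral()))   # step 825: radio B, selected peripheral
    return pairs
```

Because the reference antenna is sampled alongside every peripheral antenna, the phase difference ψk - ψ0 follows directly from each sample pair, as in equation (11).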
Multipath Effect on Circular Antenna Arrays
FIG. 9 illustrates a diagram for reducing a multipath effect of signals on a circular antenna array, according to one embodiment. In some instances, neighboring antennas may affect the performance of the direction finding algorithm. For instance, in the circular antenna array of FIG. 9, antennas located in the front relative to the direction of the incoming signal may receive more energy from the direct path from the target device, whereas antennas located in the back may receive a weaker signal that likely includes energy from a reflected path.
To reduce the multipath effect of signals, a plurality of antenna groups is formed. In the specific embodiment of FIG. 9, each group is based on three antennas that are located along a diameter of the circle formed by the peripheral antennas. That is, each group is based on two peripheral antennas located opposite each other and the reference antenna. For instance, in the diagram of FIG. 9, a first group (group A) is formed based on antennas 0, 1, and 4; a second group is formed based on antennas 0, 2, and 5; and a third group (group C) is formed based on antennas 0, 3, and 6. Each set of three antennas splits the remaining peripheral antennas into two sets. Thus, each group includes the set of three antennas it is based on and one of the two sets of remaining peripheral antennas. For example, the first group (group A) includes antennas 0, 1, 2, 3, and 4; the second group includes antennas 0, 2, 3, 4, and 5; and the third group (group C) includes antennas 0, 3, 4, 5, and 6. In another example, the first group includes antennas 0, 1, 4, 5, and 6; the second group includes antennas 0, 1, 2, 5, and 6; and the third group includes antennas 0, 1, 2, 3, and 6.
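The grouping rule above can be sketched for the seven-antenna array of FIG. 9 (the choice of which remaining half joins each group mirrors the first example in the text):

```python
def antenna_groups(num_peripheral=6):
    """Build antenna groups as in FIG. 9: each group is based on the reference
    antenna (0) plus a diametrically opposite pair of peripheral antennas, and
    includes one of the two halves of the remaining peripherals."""
    half = num_peripheral // 2
    groups = []
    for k in range(1, half + 1):
        pair = {k, k + half}                   # peripherals on a shared diameter
        between = set(range(k + 1, k + half))  # remaining peripherals on one side
        groups.append(sorted({0} | pair | between))
    return groups
```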
Each group of antennas has a range of directions for which the direction finding algorithm is more accurate than it is for the other groups. For example, group A is more accurate at determining the direction of a target object located at direction 910 than the other groups are. As such, the direction of the target object may be determined using each antenna group, and one direction is selected based on the accuracy of each group for identifying a target object located at its determined direction. That is, a first direction for the target object is determined using the first antenna group, a second direction is determined using the second antenna group, and a third direction is determined using the third antenna group. Then an accuracy for the first group to determine a target object located in the first direction is determined, and similarly for the second and third groups in the second and third directions, respectively. Based on the accuracy determined for each antenna group, a single direction for the target object is selected. That is, if the accuracy of the first group for identifying an object located in the first direction is greater than the accuracy of the second group for the second direction and the accuracy of the third group for the third direction, then the first direction is selected as the direction of the target object.
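This selection rule reduces to an argmax over per-group estimates; a sketch with the direction estimator and accuracy model left abstract (both would be supplied by the direction finding algorithm, and are not defined in the patent as functions):

```python
def select_direction(groups, estimate_direction, accuracy):
    """Estimate a direction with each antenna group, then keep the estimate whose
    group is most accurate for the direction it itself reported."""
    estimates = [(group, estimate_direction(group)) for group in groups]
    _, best_direction = max(estimates, key=lambda e: accuracy(e[0], e[1]))
    return best_direction
```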
In other embodiments, a single direction is identified for the target object using every antenna of the antenna array, and a determination is made whether the characteristics of the sampled signals agree with the signal characteristics expected at each antenna given the determined direction of the target object. That is, if the identified direction of the target object is direction 910, a determination is made whether the signals sampled for antennas 2 and 3 have a higher energy than those of the other antennas. Similarly, if the identified direction of the target object is direction 920, a determination is made whether the signals sampled for antennas 4 and 5 have a higher energy than those of the other antennas. If a determination is made that the characteristics of the sampled signals do not agree with the expected characteristics, the identified direction is considered inaccurate and may be filtered out.
Camera Steering Control Logic
The camera steering control logic uses a “dead zone” when deciding whether to rotate the video capturing device 105, as shown in FIG. 10. The dead zone spans θ degrees to the left and right side of the current camera orientation. As long as the target object 160 stays within the dead zone, the video capturing device 105 is not rotated by the rotary system 110. In some embodiments, the dead zone is a quarter of the field of view (FoV) of the video capturing device 105.
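A sketch of the dead-zone test; the split of the quarter-FoV dead zone into ±θ halves is an assumption about the geometry described above:

```python
def in_dead_zone(camera_deg, target_deg, fov_deg):
    """True when the target lies within the +/- theta dead zone around the current
    camera orientation, in which case no rotation command is issued. Theta is set
    to an eighth of the FoV so that the full dead zone spans a quarter of it."""
    theta = fov_deg / 8.0
    offset = abs((target_deg - camera_deg + 180.0) % 360.0 - 180.0)  # fold to [0, 180]
    return offset <= theta
```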
The camera steering control logic further uses an “overshooting zone” that spans 
Figure PCTCN2017083588-appb-000011
degrees to the left and right side of the FoV of the video capturing device 105 after being rotated by the rotary system 110. When the rotary system 110 determines to rotate the video capturing device 105, the camera steering control logic identifies an orientation of the video capturing device 105 so that the target object 160 is within the overshooting zone. In some embodiments, the camera steering control logic identifies the orientation to rotate the video capturing device to by identifying a zone surrounding the target object 160 having a size equal to the overshooting zone. A video capturing device orientation that is within the identified zone is then determined as the orientation the video capturing device 105 is to be rotated to.
In some embodiments, the camera steering control logic further controls the speed of the rotation of the video capturing device 105 based on the speed of the target object 160. For instance, the rotational speed of the video capturing device 105 is linearly proportional to the speed of the target object 160. In another example, the rotational speed of the video capturing device 105 is an exponential function of the speed of the target object 160.
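Putting the dead zone, overshooting zone, and speed coupling together as one hedged steering step (the zone fractions and the linear gain are illustrative choices, not values from the patent):

```python
def steering_update(camera_deg, target_deg, fov_deg, target_speed, speed_gain=1.0):
    """One steering decision: hold inside the dead zone; otherwise rotate past the
    target by an overshoot margin so it lands inside the overshooting zone, at a
    rotational speed linear in the target's speed."""
    dead_half_width = fov_deg / 8.0     # assumed split of the quarter-FoV dead zone
    overshoot = fov_deg / 10.0          # illustrative overshoot margin
    error = (target_deg - camera_deg + 180.0) % 360.0 - 180.0
    if abs(error) <= dead_half_width:
        return camera_deg % 360.0, 0.0  # target still inside the dead zone
    new_orientation = camera_deg + error + (overshoot if error > 0 else -overshoot)
    return new_orientation % 360.0, speed_gain * abs(target_speed)
```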
Multiple Object Tracking
The rotary system 110 may track the movement of more than one target object. For instance, the rotary system 110 may track the movement of multiple players in a game of soccer. In another example, the rotary system 110 may track the movement of one or more players and the ball that is currently in play. In this scenario, the different target objects being tracked may be prioritized, as not every target object can be within the FoV of the video capturing device 105 at all times. For instance, the rotary system 110 may prioritize the tracking of a ball or a specific player in a game. If multiple target objects are prioritized, the rotary system 110 may select an orientation that increases the number of tracked objects that are within the FoV of the video capturing device 105. In some embodiments, if multiple orientations result in the same number of tracked objects being within the field of view of the video capturing device 105, the rotary system may further select an orientation that reduces the amount of rotation of the video capturing device. In other embodiments, for multiple different orientations, the rotary system determines a score that is based on the number of target objects 160 that would be within the FoV of the video capturing device 105 and the amount of rotation needed to position the video capturing device 105 in that orientation. The rotary system 110 then selects the orientation with the largest score as the orientation to rotate the video capturing device 105 to.
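The scoring variant can be sketched as follows (the weights trading off visible targets against rotation cost are illustrative, not from the patent):

```python
def best_orientation(candidates, targets, camera_deg, fov_deg,
                     w_count=1.0, w_rotation=0.01):
    """Score each candidate orientation by the number of tracked targets inside
    the FoV, penalized by the rotation needed to reach it; return the best one."""
    def gap(a, b):
        return abs((a - b + 180.0) % 360.0 - 180.0)  # angular distance in [0, 180]
    def score(orientation):
        visible = sum(1 for t in targets if gap(t, orientation) <= fov_deg / 2.0)
        return w_count * visible - w_rotation * gap(orientation, camera_deg)
    return max(candidates, key=score)
```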
Multi-Platform Coordination
As described hereinabove, the direction of the target object 160 may be determined using a single rotary system 110. If multiple rotary systems 110 are used, the position of the target object 160 may be determined. FIG. 11 shows a diagram of two rotary systems 110A and 110B tracking an object 160 moving from position 1110A to position 1110B, according to one embodiment. Using rotary system 110A, angles α1 and α2 can be found. Similarly, using rotary system 110B, angles β1 and β2 can be found. If the positions of the rotary systems 110A and 110B are known, the positions 1110A and 1110B can be determined. In particular, let the position of rotary system 110A be [0, 0] (i.e., the origin of the frame of reference) and the position of rotary system 110B be [0, dAB], where dAB is the distance between the rotary systems 110A and 110B; then it can be found that:
Figure PCTCN2017083588-appb-000012
Figure PCTCN2017083588-appb-000013
Figure PCTCN2017083588-appb-000014
Figure PCTCN2017083588-appb-000015
Furthermore, the distance d between the first position 1110A and the second position 1110B can be found as:
Figure PCTCN2017083588-appb-000016
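Since the position and distance equations appear only as figures, the following is a hedged reconstruction of the triangulation under an assumed convention (system A at the origin, system B a baseline dAB away, both bearings measured from the line AB); it is not necessarily the patent's exact formulation:

```python
import math

def triangulate(alpha, beta, d_ab):
    """Intersect the two bearing rays to locate the target: `along` is the
    distance from system A along the baseline, `across` the perpendicular
    offset from the baseline."""
    ta, tb = math.tan(alpha), math.tan(beta)
    along = d_ab * tb / (ta + tb)
    across = d_ab * ta * tb / (ta + tb)
    return along, across

def displacement(p1, p2):
    """Straight-line distance d between two triangulated positions."""
    return math.hypot(p2[0] - p1[0], p2[1] - p1[1])
```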
In one embodiment, the distance d between the first position 1110A and the second position 1110B may be determined using an inertial measurement unit. In this embodiment, the position of the target object may be found without knowledge of the relative positions of the rotary systems 110.
Additional Configuration Considerations
Throughout this specification, plural instances may implement components, operations, or structures described as a single instance. Although individual operations of one or more methods are illustrated and described as separate operations, one or more of the individual operations may be performed concurrently, and nothing requires that the operations be performed in the order illustrated. Structures and functionality presented as separate components in example configurations may be implemented as a combined structure or component. Similarly, structures and functionality presented as a single component may be implemented as separate components. These and other variations, modifications, additions, and improvements fall within the scope of the subject matter herein.
Certain embodiments are described herein as including logic or a number of components, modules, or mechanisms. Modules may constitute either software modules (e.g., code embodied on a machine-readable medium or in a transmission signal) or hardware modules. A hardware module is a tangible unit capable of performing certain operations and may be configured or arranged in a certain manner. In example embodiments, one or more computer systems (e.g., a standalone, client or server computer system) or one or more hardware modules of a computer system (e.g., a processor or a group of processors) may be configured by software (e.g., an application or application portion) as a hardware module that operates to perform certain operations as described herein.
In various embodiments, a hardware module may be implemented mechanically or electronically. For example, a hardware module may comprise dedicated circuitry or logic that is permanently configured (e.g., as a special-purpose processor, such as a field programmable gate array (FPGA) or an application-specific integrated circuit (ASIC)) to  perform certain operations. A hardware module may also comprise programmable logic or circuitry (e.g., as encompassed within a general-purpose processor or other programmable processor) that is temporarily configured by software to perform certain operations. It will be appreciated that the decision to implement a hardware module mechanically, in dedicated and permanently configured circuitry, or in temporarily configured circuitry (e.g., configured by software) may be driven by cost and time considerations.
The various operations of example methods described herein may be performed, at least partially, by one or more processors that are temporarily configured (e.g., by software) or permanently configured to perform the relevant operations. Whether temporarily or permanently configured, such processors may constitute processor-implemented modules that operate to perform one or more operations or functions. The modules referred to herein may, in some example embodiments, comprise processor-implemented modules.
The one or more processors may also operate to support performance of the relevant operations in a “cloud computing” environment or as a “software as a service” (SaaS) . For example, at least some of the operations may be performed by a group of computers (as examples of machines including processors) , these operations being accessible via a network (e.g., the Internet) and via one or more appropriate interfaces (e.g., application program interfaces (APIs) . )
The performance of certain of the operations may be distributed among the one or more processors, not only residing within a single machine, but deployed across a number of machines. In some example embodiments, the one or more processors or processor-implemented modules may be located in a single geographic location (e.g., within a home environment, an office environment, or a server farm). In other example embodiments, the one or more processors or processor-implemented modules may be distributed across a number of geographic locations.
Some portions of this specification are presented in terms of algorithms or symbolic representations of operations on data stored as bits or binary digital signals within a machine memory (e.g., a computer memory) . These algorithms or symbolic representations are examples of techniques used by those of ordinary skill in the data processing arts to convey the substance of their work to others skilled in the art. As used herein, an “algorithm” is a self-consistent sequence of operations or similar processing leading to a desired result. In this context, algorithms and operations involve physical manipulation of physical quantities. Typically, but not necessarily, such quantities may take the form of electrical, magnetic, or optical signals capable of being stored, accessed, transferred, combined, compared, or otherwise manipulated by a machine. It is convenient at times, principally for reasons of common usage, to refer to such signals using words such as “data, ” “content, ” “bits, ” “values, ” “elements, ” “symbols, ” “characters, ” “terms, ” “numbers, ” “numerals, ” or the like. These words, however, are merely convenient labels and are to be associated with appropriate physical quantities.
Unless specifically stated otherwise, discussions herein using words such as “processing, ” “computing, ” “calculating, ” “determining, ” “presenting, ” “displaying, ” or the like may refer to actions or processes of a machine (e.g., a computer) that manipulates or transforms data represented as physical (e.g., electronic, magnetic, or optical) quantities within one or more memories (e.g., volatile memory, non-volatile memory, or a combination thereof) , registers, or other machine components that receive, store, transmit, or display information.
As used herein any reference to “one embodiment” or “an embodiment” means that a particular element, feature, structure, or characteristic described in connection with the embodiment is included in at least one embodiment. The appearances of the phrase “in one embodiment” in various places in the specification are not necessarily all referring to the  same embodiment.
Some embodiments may be described using the expression “coupled” and “connected” along with their derivatives. For example, some embodiments may be described using the term “coupled” to indicate that two or more elements are in direct physical or electrical contact. The term “coupled, ” however, may also mean that two or more elements are not in direct contact with each other, but yet still co-operate or interact with each other. The embodiments are not limited in this context.
As used herein, the terms “comprises, ” “comprising, ” “includes, ” “including, ” “has, ” “having” or any other variation thereof, are intended to cover a non-exclusive inclusion. For example, a process, method, article, or apparatus that comprises a list of elements is not necessarily limited to only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Further, unless expressly stated to the contrary, “or” refers to an inclusive or and not to an exclusive or. For example, a condition A or B is satisfied by any one of the following: A is true (or present) and B is false (or not present) , A is false (or not present) and B is true (or present) , and both A and B are true (or present) .
In addition, use of the “a” or “an” are employed to describe elements and components of the embodiments herein. This is done merely for convenience and to give a general sense of the invention. This description should be read to include one or at least one and the singular also includes the plural unless it is obvious that it is meant otherwise.
Upon reading this disclosure, those of skill in the art will appreciate still additional alternative structural and functional designs through the disclosed principles herein. Thus, while particular embodiments and applications have been illustrated and described, it is to be understood that the disclosed embodiments are not limited to the precise construction and components disclosed herein. Various modifications, changes and variations, which will be  apparent to those skilled in the art, may be made in the arrangement, operation, and details of the method and apparatus disclosed herein without departing from the spirit and scope defined in the appended claims.

Claims (20)

  1. An object tracking device for tracking the movement of a target object, the object tracking device comprising:
    a direction finding subsystem for determining a direction of the target object, the direction finding subsystem comprising:
    a plurality of peripheral antennas,
    a reference antenna,
    an antenna switch coupled to the plurality of peripheral antennas, the antenna switch for selecting one of the plurality of peripheral antennas,
    a first wireless radio coupled to the antenna switch, the first wireless radio for receiving a wireless signal from the target object via a peripheral antenna selected by the antenna switch,
    a second wireless radio coupled to the reference antenna, the second wireless radio for receiving the wireless signal from the target object via the reference antenna, and
    a processing unit for controlling the antenna switch and determining the direction of the target object based on signals sampled by the first wireless radio and the second wireless radio; and
    a steering subsystem coupled to the direction finding subsystem, the steering subsystem comprising:
    a holder for attaching a video capturing device to the steering subsystem, one or more motors for rotating the holder, and
    a control unit for receiving the direction of the target object determined by the direction finding subsystem and controlling the one or more motors based on the received direction of the target object.
  2. The object tracking device of claim 1, wherein the plurality of peripheral antennas are arranged in a circle, and wherein the reference antenna is in a center of the circle.
  3. The object tracking device of claim 1, wherein the processing unit is configured to:
    control the antenna switch to select a first peripheral antenna of the plurality of peripheral antennas;
    control the first wireless radio to sample a first wireless signal using the first peripheral antenna, and control the second wireless radio to sample the first wireless signal using the reference antenna; and
    responsive to the first and second wireless radio sampling the first wireless signal:
    control the antenna switch to select a second peripheral antenna of the plurality of peripheral antennas, and
    control the first wireless radio to sample a second wireless signal using the second peripheral antenna, and control the second wireless radio to sample the second wireless signal using the reference antenna.
  4. The object tracking device of claim 3, wherein the processing unit is further configured to:
    determine a first phase difference between the first wireless signal sampled by the first peripheral antenna and the reference antenna;
    determine a second phase difference between the second wireless signal sampled by the second peripheral antenna and the reference antenna; and
    determine a direction of a target object that transmitted the first wireless signal and the second wireless signal based on (1) the first determined phase difference between the first peripheral antenna and the reference antenna, and (2) the  second determined phase difference between the second peripheral antenna and the reference antenna.
  5. The object tracking device of claim 1, wherein the wireless signal is a Bluetooth signal.
  6. The object tracking device of claim 5, wherein the wireless signal corresponds to one of (1) a stream of zeros and (2) a stream of ones.
  7. The object tracking device of claim 1, wherein the control unit of the steering subsystem is configured to:
    determine whether the direction of the target object is within a field of view of the video capturing device; and
    responsive to determining that the direction of the target object is not within the field of view of the video capturing device:
    rotate the video capturing device so that the direction of the target object is within the field of view of the video capturing device.
  8. The object tracking device of claim 1, wherein the control unit of the steering subsystem is configured to:
    determine whether the direction of the target object is within a first range; and
    responsive to determining that the direction of the target object is not within the first range:
    identify a second range from a plurality of preset ranges based on the direction of the target object, and
    actuate the one or more motors based on a value associated with the second range.
  9. A direction finding device for determining a direction of a target object, the direction finding device comprising:
    a plurality of peripheral antennas,
    a reference antenna,
    an antenna switch coupled to the plurality of peripheral antennas, the antenna switch for selecting one of the plurality of peripheral antennas;
    a first wireless radio coupled to the antenna switch, the first wireless radio for receiving a wireless signal from the target object via a peripheral antenna selected by the antenna switch;
    a second wireless radio coupled to the reference antenna, the second wireless radio for receiving the wireless signal from the target object via the reference antenna; and
    a processing unit for controlling the antenna switch and determining the direction of the target object based on signals sampled by the first wireless radio and the second wireless radio.
  10. The direction finding device of claim 9, wherein the plurality of peripheral antennas are arranged in a circle, and wherein the reference antenna is in a center of the circle.
  11. The direction finding device of claim 10, wherein the processing unit is configured to:
    control the antenna switch to select a first peripheral antenna of the plurality of peripheral antennas;
    control the first wireless radio to sample a first wireless signal using the first peripheral antenna, and control the second wireless radio to sample the first wireless signal using the reference antenna; and
    responsive to the first and second wireless radio sampling the first wireless signal:
    control the antenna switch to select a second peripheral antenna of the plurality of peripheral antennas, and
    control the first wireless radio to sample a second wireless signal using the second peripheral antenna, and control the second wireless radio to sample the second wireless signal using the reference antenna.
  12. The direction finding device of claim 11, wherein the processing unit is further configured to:
    determine a phase difference between the first wireless signal sampled by the first peripheral antenna and the reference antenna;
    determine a phase difference between the second wireless signal sampled by the second peripheral antenna and the reference antenna; and
    determine a direction of a target object that transmitted the first wireless signal and the second wireless signal based on (1) the determined phase difference between the first peripheral antenna and the reference antenna, and (2) the determined phase difference between the second peripheral antenna and the reference antenna.
  13. The direction finding device of claim 9, wherein the wireless signal is a Bluetooth signal.
  14. The direction finding device of claim 13, wherein the wireless signal corresponds to one of (1) a stream of zeros and (2) a stream of ones.
  15. A computer-implemented method comprising:
    selecting a first peripheral antenna of a plurality of peripheral antennas;
    sampling a first wireless signal using the first peripheral antenna;
    sampling the first wireless signal using a reference antenna;
    determining a phase difference between the first wireless signal sampled by the first peripheral antenna and the first wireless signal sampled by the reference antenna;
    selecting a second peripheral antenna of the plurality of peripheral antennas;
    sampling a second wireless signal using the second peripheral antenna;
    sampling the second wireless signal using the reference antenna;
    determining a phase difference between the second wireless signal sampled by the second peripheral antenna and the second wireless signal sampled by the reference antenna; and
    determining a direction of a target object that transmitted the first wireless signal and the second wireless signal based on (1) the determined phase difference between the first wireless signal sampled by the first peripheral antenna and the first wireless signal sampled by the reference antenna, and (2) the determined phase difference between the second wireless signal sampled by the second peripheral antenna and the second wireless signal sampled by the reference antenna.
  16. The computer-implemented method of claim 15, wherein the plurality of peripheral antennas are arranged in a circle, and wherein the reference antenna is in a center of the circle.
  17. The computer-implemented method of claim 15, wherein the wireless signal is a Bluetooth signal.
  18. The computer-implemented method of claim 15, wherein the wireless signal corresponds to one of (1) a stream of zeros, and (2) a stream of ones.
  19. The computer-implemented method of claim 15, further comprising:
    determining whether the direction of the target object is within a field of view of a video capturing device; and
    responsive to determining that the direction of the target object is not within the field of view of the video capturing device:
    rotating the video capturing device so that the direction of the target object is within the field of view of the video capturing device.
  20. The computer-implemented method of claim 15, further comprising:
    determining whether the direction of the target object is within a first range; and
    responsive to determining that the direction of the target object is not within the first range:
    identifying a second range from a plurality of preset ranges based on the direction of the target object, and
    actuating a plurality of rotors based on a value associated with the second range.
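The method of claims 15–20 can be illustrated with a short sketch. All numeric values below (carrier wavelength, array radius, the two peripheral-antenna angles, field-of-view width) are illustrative assumptions, not values taken from the application, and the grid search in `estimate_direction` is just one possible way to invert the two measured phase differences; the function names are likewise hypothetical.

```python
import numpy as np

# Assumed geometry: peripheral antennas on a circle, reference antenna at its
# center (as in claim 16). None of these constants come from the application.
WAVELENGTH = 0.125   # meters, roughly a 2.4 GHz Bluetooth carrier
RADIUS = 0.03        # meters, radius of the peripheral-antenna circle
ANT_ANGLES = np.array([0.0, np.pi / 2])  # angular positions of the two peripherals

def predicted_phase_diffs(azimuth):
    """Phase difference (peripheral minus reference) expected for a plane
    wave arriving from `azimuth`, one value per peripheral antenna."""
    return 2 * np.pi * RADIUS / WAVELENGTH * np.cos(azimuth - ANT_ANGLES)

def estimate_direction(measured_diffs):
    """Grid-search the azimuth whose predicted phase differences best match
    the measured ones, using a wrapped least-squares residual."""
    grid = np.linspace(0.0, 2 * np.pi, 3600, endpoint=False)
    errs = [np.sum(np.angle(np.exp(1j * (predicted_phase_diffs(az)
                                         - measured_diffs))) ** 2)
            for az in grid]
    return grid[int(np.argmin(errs))]

def rotation_needed(target_az_deg, camera_az_deg, fov_deg=60.0):
    """Claim 19 sketch: return the rotation (degrees) that re-centers the
    target, or 0.0 if it already lies within the camera's field of view."""
    offset = ((target_az_deg - camera_az_deg + 180.0) % 360.0) - 180.0
    return 0.0 if abs(offset) <= fov_deg / 2.0 else offset

# Simulate a transmitter at 30 degrees azimuth and recover its direction.
measured = predicted_phase_diffs(np.deg2rad(30.0))
print(round(float(np.rad2deg(estimate_direction(measured))), 1))
print(rotation_needed(30.0, 90.0))  # camera pointed at 90 deg: rotation required
```

Wrapping the residual through `np.angle(np.exp(1j * ...))` keeps the comparison well defined when a phase difference crosses the ±π boundary, which matters once the array radius approaches a half wavelength.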
PCT/CN2017/083588 2017-05-09 2017-05-09 Direction finding of wireless communication devices WO2018205133A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/CN2017/083588 WO2018205133A1 (en) 2017-05-09 2017-05-09 Direction finding of wireless communication devices

Publications (1)

Publication Number Publication Date
WO2018205133A1 true WO2018205133A1 (en) 2018-11-15

Family

ID=64104259

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115997430A (en) * 2020-08-25 2023-04-21 上海诺基亚贝尔股份有限公司 Relative phase determination for frequency drift compensation

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20140253741A1 (en) * 2013-03-08 2014-09-11 Panasonic Corporation Camera system and switching device
CN104766106A (en) * 2015-03-05 2015-07-08 桑田智能工程技术(上海)有限公司 Intelligent tracking and monitoring system based on RFID label signal intensity
CN105676865A (en) * 2016-04-12 2016-06-15 北京博瑞爱飞科技发展有限公司 Target tracking method, device and system
CN106023251A (en) * 2016-05-16 2016-10-12 西安斯凯智能科技有限公司 Tracking system and tracking method
CN106584516A (en) * 2016-11-01 2017-04-26 河池学院 Intelligent photographing robot for tracing specified object

Similar Documents

Publication Publication Date Title
CN105393079B (en) Depth transducer control based on context
US10721384B2 (en) Camera with radar system
EP3354007B1 (en) Video content selection
JP6297278B2 (en) Real-time radio frequency signal visualization device
US10942252B2 (en) Tracking system and tracking method
US20160309095A1 (en) Methods and apparatus for capturing images using multiple camera modules in an efficient manner
CN107852447B (en) Balancing exposure and gain at an electronic device based on device motion and scene distance
US20220236359A1 (en) Cooperative automatic tracking
US10938102B2 (en) Search track acquire react system (STARS) drone integrated acquisition tracker (DIAT)
US20090304374A1 (en) Device for tracking a moving object
CN101027900A (en) System and method for the production of composite images comprising or using one or more cameras for providing overlapping images
CN107404615B (en) Image recording method and electronic equipment
US20130286049A1 (en) Automatic adjustment of display image using face detection
US11050972B1 (en) Systems and methods for generating time-lapse videos
GB2463703A (en) Estimating the direction in which a camera is pointing as a photograph is taken
CN109348170B (en) Video monitoring method and device and video monitoring equipment
CN108370412B (en) Control device, control method, and recording medium
CN111213365A (en) Shooting control method and controller
WO2018205133A1 (en) Direction finding of wireless communication devices
CN113784041A (en) Automatic tracking photography holder and method based on UWB
US20170019585A1 (en) Camera clustering and tracking system
CN115150748B (en) Indoor positioning method, system, electronic equipment and storage medium
Fang et al. Person tracking and identification using cameras and Wi-Fi Channel State Information (CSI) from smartphones: dataset
CN107979731B (en) Method, device and system for acquiring audio and video data
CN109155818B (en) Head rotation tracking device for video highlight recognition

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 17909102

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 17909102

Country of ref document: EP

Kind code of ref document: A1

32PN Ep: public notification in the ep bulletin as address of the addressee cannot be established

Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC (EPO FORM 1205A DATED 16.03.2020)
