WO2018205133A1 - Direction finding of wireless communication devices
- This disclosure relates generally to digital content processing and particularly to capturing video highlights in sports videos by tracking the position of a target object and synchronizing a video capturing device (e.g., a mobile phone) on a rotary station with the movement of the target object.
- a video highlight of a sports video is a portion of the sports video and represents a semantically important event captured in the sports video (e.g., a short video clip capturing goals or goal attempts in a soccer game video) .
- Embodiments of the disclosure provide an object tracking device for tracking the movement of a target object.
- the object tracking device includes a direction finding subsystem and a steering subsystem.
- the direction finding subsystem determines the direction of the target object.
- the direction finding subsystem includes multiple peripheral antennas, a reference antenna, an antenna switch coupled to the peripheral antennas, a first wireless radio, a second wireless radio, and a processing unit.
- the antenna switch selects one of the plurality of peripheral antennas.
- the first wireless radio receives a wireless signal from the target object via a peripheral antenna selected by the antenna switch.
- the second wireless radio receives the wireless signal from the target object via the reference antenna.
- the processing unit controls the antenna switch and determines the direction of the target object based on signals sampled by the first wireless radio and the second wireless radio.
- the steering subsystem includes a holder, one or more motors, and a control unit.
- the holder is configured to be attached to a video capturing device (e.g., a video camera) .
- the one or more motors are configured to rotate the holder.
- the control unit receives the direction of the target object determined by the direction finding subsystem and controls the one or more motors based on the received direction of the target object.
- Embodiments also include a method for determining the direction of a target object.
- a first wireless signal is sampled using a first peripheral antenna of a plurality of peripheral antennas, and a reference antenna.
- a phase difference between the first wireless signal sampled by the first peripheral antenna and the first wireless signal sampled by the reference antenna is determined.
- a second wireless signal is sampled using a second peripheral antenna of the plurality of peripheral antennas, and the reference antenna.
- a phase difference between the second wireless signal sampled by the second peripheral antenna and the second wireless signal sampled by the reference antenna is determined.
- the direction of the target object is determined based on (1) the phase difference between the first wireless signal sampled by the first peripheral antenna and the first wireless signal sampled by the reference antenna, and (2) the phase difference between the second wireless signal sampled by the second peripheral antenna and the second wireless signal sampled by the reference antenna. More such phase differences can be similarly obtained using wireless signal samples from additional peripheral antennas against the same reference antenna, and incorporated into the determination process of the target object
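The two-antenna relationship underlying this method can be illustrated with a short sketch. It assumes a narrowband plane wave and a single reference/peripheral baseline; the 2.4 GHz wavelength and λ/4 spacing are example values, and the method described above combines several such phase differences rather than just one.

```python
import math

def phase_to_angle(delta_phi, d, wavelength):
    """Map the phase difference (radians) between one peripheral antenna and
    the reference antenna, separated by distance d, to an incident angle via
    the plane-wave relation delta_phi = 2*pi*d*sin(theta)/wavelength."""
    s = delta_phi * wavelength / (2 * math.pi * d)
    s = max(-1.0, min(1.0, s))  # clamp against measurement noise
    return math.asin(s)

# Assumed example: BLE at 2.4 GHz (wavelength ~0.125 m), antennas lambda/4 apart.
wavelength = 0.125
d = wavelength / 4
theta = phase_to_angle(math.pi / 4, d, wavelength)
print(round(math.degrees(theta), 1))  # → 30.0
```

A quarter-cycle phase lead over a λ/4 baseline corresponds to a 30-degree incident angle.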
- FIG. 1 is a block diagram of an environment for tracking a wireless communication device and synchronizing a video capturing device (e.g. a mobile phone) connected to a rotary station with the movement of the wireless communication device, according to one embodiment.
- FIG. 2A illustrates a block diagram of the rotary system, according to one embodiment.
- FIG. 2B illustrates a block diagram of the rotary system, according to another embodiment.
- FIG. 3A illustrates a block diagram of the camera steering & servo subsystem, according to one embodiment.
- FIG. 3B illustrates a block diagram of the BLE direction finding subsystem, according to one embodiment.
- FIG. 4A illustrates a top and side view of an antenna array arranged in a straight line, according to one embodiment.
- FIG. 4B illustrates a top and side view of an antenna array arranged in a circular pattern, according to one embodiment.
- FIG. 5A illustrates a uniform linear antenna array, according to one embodiment.
- FIG. 5B illustrates a circular antenna array, according to one embodiment.
- FIG. 6 illustrates a flow diagram of a method for determining the direction of a target object using an antenna switching pattern, according to one embodiment.
- FIG. 7 shows a circular antenna array with two wireless radios, according to one embodiment.
- FIG. 8 illustrates a flow diagram of a method for determining the direction of a target object using two wireless radios, according to one embodiment.
- FIG. 9 illustrates a diagram for reducing a multipath effect of signals on a circular antenna array, according to one embodiment.
- FIG. 10 illustrates a diagram of the field of view (FoV) for deciding whether to rotate the video capturing device, according to one embodiment.
- FIG. 11 illustrates a diagram of two rotary systems tracking an object moving from a first position to a second position, according to one embodiment.
- a solution is provided to conveniently capture video highlights in a sports event by tracking movement of a target object and synchronizing rotations of a video capturing device (e.g., a mobile phone) on a rotary station with the movement of the target object.
- the target object includes a wireless module that transmits a wireless signal that can be used to determine the position of the target object.
- Position data describing the position of the target object, or motion data describing the movement of the target object is transmitted to the rotary station, and processed to generate commands to drive the video capturing device (e.g., a mobile phone) mounted on the rotary station, to follow the movement of the target object.
- digital content generally refers to any machine-readable and machine-storable work.
- Digital content can include, for example, a combination of video and audio.
- digital content can be a still image.
- the digital content will be referred to as a “video” but no limitation on the type of digital content that can be analyzed is intended by this terminology.
- FIG. 1 is a block diagram of an environment 100 for tracking a wireless communication device and synchronizing a video capturing device (e.g. a mobile phone) connected to a rotary station with the movement of the wireless communication device, according to one embodiment.
- the embodiment illustrated in FIG. 1 includes a target 160 that transmits a wireless signal using one or more wireless communication protocols (e.g., Bluetooth or Wi-Fi) , and a video capturing device 105 mounted on a rotary system 110.
- the rotary system 110 is further mounted on a tripod 150. If the video capturing device 105 is recording a sports game, the target 160 may be inside a sports field 120.
- the target 160 includes a wireless module that transmits a wireless signal using a wireless communication protocol.
- the target 160 transmits a data packet using the wireless communication protocol such that the encoded wireless signal has a specific characteristic (e.g., a sinusoidal signal with a single frequency component) .
- a data packet may be generated so that the encoded wireless signal is a sequence of zeros or a sequence of ones.
- although the Bluetooth protocol is used throughout, the present disclosure is not limited thereto.
- other wireless protocols such as Wi-Fi, radio-frequency identification (RFID) or Zigbee may be used instead.
- the video capturing device 105 is an electronic device used to capture digital content, such as recording a video clip of a sports event happening on the sports field 120.
- the video capturing device 105 is a mobile phone and includes an image capturing device and a transmitter for transmitting the captured digital content.
- the video capturing device 105 can be, for example, a smart phone, a tablet, a digital single-lens reflex camera (DSLR) , or any other suitable user device or computing device for capturing digital content.
- the video capturing device 105 can be remotely triggered by a remote control to capture digital content.
- the video capturing device 105 captures digital content for a predetermined period of time (e.g., 30 seconds) . In other configurations, the video capturing device 105 begins capturing digital content when remotely triggered by the remote control and ends capturing digital content when again remotely triggered by the remote control. In addition, the remote control may have certain function buttons to tag the content being captured, and certain sensors to incorporate rich content tags such as a voice tag. The video capturing device 105 can transmit the captured digital content to a cloud storage service. The video capturing device 105 is mounted on or otherwise connected to the rotary system 110.
- a user may use a mobile device as a video capturing device to record video clips of a sports event and to consume digital content, such as the recorded video clips or highlights thereof.
- the user uses the mobile device to perform functions such as consuming video clip (s) captured by the video capturing device 105 and video highlights generated from the captured video clip (s) .
- the mobile device of the user can be a smart phone, a tablet, or any other suitable user device or computing device for consuming video clip (s) .
- the mobile device of the user provides a user interface, such as physical and/or on-screen buttons, with which the user may interact with the mobile device to perform functions such as tagging, viewing, sharing, and/or otherwise consuming the video clip (s) .
- the rotary system 110 is configured to drive the video capturing device 105 (e.g., a mobile phone of the user having a digital camera) mounted on the rotary system 110 to follow the movement of the target object 160.
- the rotary system 110 is further described with respect to FIG. 2A and 2B.
- FIG. 2A illustrates a block diagram of the rotary system 110, according to one embodiment.
- the rotary system 110 includes a camera holder 210, a camera steering & servo subsystem (CSSS) 220, and a Bluetooth Low Energy (BLE) direction finding subsystem (BDFS) 230.
- the camera holder 210 provides support for attaching a video capturing device 105, such as, a video camera recorder, to the rotary system 110.
- the camera steering & servo subsystem (CSSS) 220 receives an indication of the direction of the target 160 and moves the camera holder to direct the video capturing device toward the identified direction of the target 160.
- FIG. 3A shows a block diagram of the CSSS 220, according to one embodiment.
- the CSSS 220 includes one or more gear sets 310, one or more motors 320, a rechargeable battery 330, and control circuitry 340.
- the one or more motors rotate the camera holder 210 with one or more degrees of freedom.
- the CSSS 220 includes two motors that provide movement to the camera holder 210 with two degrees of freedom (e.g., panning and tilting) .
- the control circuitry 340 implements the camera steering logic to allow the camera to follow the target. For instance, the control circuitry 340 may keep the target within the field of view (FoV) of the camera.
- the CSSS 220 further includes a gear set 310 and a code wheel to provide higher-precision control of the rotation of the camera holder 210.
- the CSSS 220 further includes a wireless module to enable communication with other devices (e.g., a remote control or a phone or the video capturing device 105) .
- the motors 320 are voltage controlled. That is, the rotational velocity of the motor 320 is directly proportional to the voltage applied to the motor 320. Furthermore, the codewheel provides one-degree precision, while the gear set provides a 1:36 reduction. That is, the codewheel combined with a gear set 310 provides a 1/36-degree steering step size.
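The resolution arithmetic above works out as plain division — one-degree codewheel precision through a 1:36 reduction gear:

```python
# One-degree codewheel precision divided by the 1:36 gear reduction
# gives the steering step size described above.
codewheel_precision_deg = 1.0
gear_reduction = 36
step_deg = codewheel_precision_deg / gear_reduction  # 1/36 degree per step
steps_per_revolution = int(round(360 / step_deg))
print(step_deg, steps_per_revolution)
```

One full revolution therefore resolves into 12,960 steering steps.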
- the control circuitry 340 controls the steering of the CSSS 220.
- the CSSS is configured to filter the position information of the target object 160 to increase the smoothness of a video captured by the video capturing device 105, while keeping the target object near the center of the field of view of the video capturing device 105.
- the BLE direction finding subsystem (BDFS) 230 finds the direction of the target 160 based on a wireless signal transmitted by a wireless transceiver of the target 160.
- the wireless signal is a Bluetooth signal.
- FIG. 3B shows a block diagram of the BDFS 230, according to one embodiment.
- the BDFS 230 includes multiple antennas 350, one or two wireless radios 360 (e.g., two Bluetooth Low Energy (BLE) radios) , an antenna switch 370, and a microcontroller 390.
- the BDFS 230 receives power from the battery of the CSSS 220.
- the BDFS 230 includes a battery for powering the components of the BDFS 230.
- Each of the wireless radios 360 is coupled to a subset of the antennas.
- the BDFS 230 includes two wireless radios.
- one wireless radio is coupled to a first antenna (e.g., a reference antenna)
- the other wireless radio is coupled to the other antennas.
- the antenna switch 370 allows the wireless radios to switch between antennas. As such, a single wireless radio may receive or transmit different wireless signals using different antennas 350.
- the wireless radios 360 receive electrical signals picked up by the antennas 350 and demodulate the electrical signals to extract the information carried by the signals.
- the wireless radios 360 may further modulate information to be sent via the antennas, producing a current that can be used to excite one or more antennas to emit radio waves.
- the wireless radios 360 are used in conjunction with the antennas 350 to receive wireless signals from the target object 160 for determining the direction of the target object.
- the wireless radios 360 and the wireless antennas 350 are configured to receive Bluetooth signals.
- one or more radios 360 are used in conjunction with one or more antennas 350 to communicate with other mobile devices.
- a rotary system 110 may communicate with another rotary system 110 via Bluetooth using one wireless radio 360 coupled to one antenna 350.
- the rotary system 110 may also wirelessly communicate with the video capturing device 105.
- the rotary system 110 may obtain properties of the video capturing device 105, such as, field of view information.
- the analog-to-digital converter (ADC) 380 converts an analog signal received from a wireless radio 360 into a digital signal.
- the ADC 380 is a discrete component.
- the ADC is integrated inside the microcontroller 390.
- the microcontroller 390 implements the core control logic (CCL) , the direction finding (DF) algorithm, and the post filtering of the resulting directions.
- the CCL performs admission control of BLE devices by obtaining the BLE device ID information from the BLE chip and processing incoming data only from BLE devices that are included in a whitelist.
- the CCL further performs A/D control by indicating to the ADC 380 when to sample signals.
- the CCL further performs antenna switch control by directing the antenna switch 370 to switch among antennas at certain times and according to a certain antenna switch pattern (ASP) .
- the DF algorithm performs multiple signal classification and determines the parameters of multiple wavefronts arriving at the antennas 350 from the signal measurements made at each of the antennas.
- the DF algorithm may determine the number of incoming signals, the strength of signals and noise, the cross correlation among directional waveforms, direction of arrival of the incoming signals, and the frequency components of the incoming signals.
- the post filtering removes outliers and smooths out jitter from the obtained set of directions.
- the obtained set of directions may be noisy due to system noise and multipath effects of the incoming signal.
- the post filtering may include removing directions that are impossible or unlikely in practice. For instance, directions that correspond to underground locations may be filtered out for the position of a basketball in a game, since it is unlikely the basketball will be underground during actual gameplay. Examples of post filtering include the Kalman filter.
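A minimal post filter in the spirit described above might reject outliers and smooth jitter. The jump threshold and smoothing factor below are illustrative assumptions, and a Kalman filter could replace the exponential smoother:

```python
def post_filter(directions_deg, max_jump=30.0, alpha=0.3):
    """Illustrative post filter (assumed parameters): drop estimates that
    jump more than max_jump degrees from the running estimate (outliers,
    e.g. from multipath), then exponentially smooth the rest."""
    filtered = []
    est = None
    for d in directions_deg:
        if est is None:
            est = d                               # seed with first estimate
        elif abs(d - est) > max_jump:
            continue                              # reject outlier
        else:
            est = (1 - alpha) * est + alpha * d   # smooth jitter
        filtered.append(est)
    return filtered

# A 95-degree spike amid ~10-degree readings is dropped; the rest are smoothed.
print(post_filter([10, 11, 95, 12, 13]))
```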
- FIG. 4A illustrates a top and side view of an antenna array arranged in a straight line, according to one embodiment.
- the antenna array of FIG. 4A includes 5 individual antennas arranged in a straight line. In other embodiments, the antenna array may include a different number of antennas.
- the antenna array includes a reference antenna 410 and multiple peripheral antennas 420.
- the reference antenna can be any antenna from the antenna array. In the embodiment of FIG. 4A, the reference antenna is at the center of the antenna array. In some embodiments, the antennas are separated by a distance in the range of λ/4 to λ/2, where λ is the wavelength of the wireless signal being received by the antennas.
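For the λ/4 to λ/2 spacing range above, a quick computation gives the physical bounds; the 2.4 GHz BLE carrier is an assumed example frequency:

```python
# Antenna spacing bounds for the lambda/4 .. lambda/2 range described above.
c = 299_792_458.0          # speed of light, m/s
f = 2.4e9                  # assumed BLE carrier frequency, Hz
wavelength = c / f         # ~0.125 m
lo, hi = wavelength / 4, wavelength / 2
print(round(lo * 100, 2), round(hi * 100, 2))  # spacing bounds in cm
```

So the antennas would sit roughly 3 to 6 cm apart at 2.4 GHz.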
- FIG. 4B illustrates a top and side view of an antenna array arranged in a circular pattern, according to one embodiment.
- the antenna array of FIG. 4B includes multiple peripheral antennas 420 arranged in a circular pattern, and a reference antenna 410 arranged in the center of the peripheral antennas 420.
- the antenna array includes an even number of peripheral antennas 420 that are evenly spaced. As such, the reference antenna 410 is in between pairs of peripheral antennas 420.
- FIG. 5A illustrates the direction finding of an incoming signal in a uniform linear antenna array, according to one embodiment.
- the antenna array includes N+1 antennas, each spaced with a distance d.
- the incident angle θ to all antennas is substantially equal.
- the received signals can form a steering vector as: a (θ) = [1, exp (-j2πd·sinθ/λ) , …, exp (-j2πNd·sinθ/λ) ] ^T, where λ is the wavelength of the incoming signal.
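A uniform-linear-array steering vector of this kind can be constructed directly. The code below uses the standard textbook form exp(-j·2πkd·sinθ/λ) for the k-th antenna, as a sketch rather than the patent's exact formulation; the spacing and wavelength are example values:

```python
import cmath
import math

def ula_steering_vector(theta, n_antennas, d, wavelength):
    """Steering vector of a uniform linear array: element k is the phase
    factor exp(-j*2*pi*k*d*sin(theta)/wavelength) seen at antenna k
    relative to antenna 0."""
    phase = 2 * math.pi * d * math.sin(theta) / wavelength
    return [cmath.exp(-1j * k * phase) for k in range(n_antennas)]

v = ula_steering_vector(math.radians(30), 5, 0.03, 0.125)
print([round(abs(x), 3) for x in v])  # every element has unit magnitude
```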
- FIG. 5B illustrates the direction finding of an incoming signal in a circular antenna array, according to one embodiment.
- antenna 0 is the reference antenna 410, and antenna 1 lies along the orientation baseline.
- for N peripheral antennas evenly placed on a circle of radius r, the effective incident angle for the k-th antenna is θ_k = θ - 2πk/N, and the steering vector would be: a (θ) = [exp (-j2π (r/λ) cosθ_0) , exp (-j2π (r/λ) cosθ_1) , …, exp (-j2π (r/λ) cosθ_ (N-1) ) ] ^T, taking the center reference antenna as the phase reference.
- the circular antenna array exhibits better omni-sensitivity to different incoming directions.
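The circular-array counterpart can be sketched the same way, placing N peripheral antennas evenly on a circle and measuring phase against the center reference antenna. This is the standard uniform-circular-array form; the radius and angle are illustrative values:

```python
import cmath
import math

def uca_steering_vector(theta, n_antennas, radius, wavelength):
    """Steering vector of a uniform circular array: antenna k sits at angle
    2*pi*k/N on a circle of the given radius, so its phase relative to the
    center reference antenna is 2*pi*(radius/wavelength)*cos(theta - 2*pi*k/N)."""
    return [
        cmath.exp(-1j * 2 * math.pi * radius / wavelength
                  * math.cos(theta - 2 * math.pi * k / n_antennas))
        for k in range(n_antennas)
    ]

v = uca_steering_vector(math.radians(45), 6, 0.03, 0.125)
print(len(v), round(abs(v[0]), 3))  # six unit-magnitude phase factors
```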
- one wireless radio 360 may be shared among multiple antennas 350 that are multiplexed using antenna switch 370.
- the BDFS may have a single wireless radio that can be used in conjunction with any of the antennas in the antenna array 350.
- the BDFS includes a first wireless radio 360 that can be used in conjunction with a first subset of antennas of the antenna array 350, and a second wireless radio 360 that can be used in conjunction with a second subset of antennas of the antenna array 350.
- different schemes may be used to determine the frequency and phase of an incoming signal so as to determine the direction of the incoming signal.
- the schemes include frequency compensation, frequency cancellation, and dual radios.
- the frequency compensation method predicts the phase change over the interval since the sampling at the previous antenna, and compensates for the frequency offset of the signal when calculating the phase differences among different antennas.
- the instantaneous phase is calculated from the in-phase and quadrature components of each sample, and a series of phase changes (Δφ) is obtained. Knowing the sampling rate, a phase change rate may be estimated, with which an estimate of the signal frequency may be obtained when switching to the next antenna.
- the n-th sample at the k-th antenna can be represented as: s_k (n) = A·exp (j (2πf (t_k + nT_s) + θ + ψ_k) ) ,
- where T_s is the sampling interval, t_k is the time at which sampling of the k-th antenna starts, and ψ_k is the phase change relative to the reference antenna due to the array geometry.
- letting φ_k (n) represent the phase and omitting the sampling interval T_s:
- φ_k (n) = 2πf (t_k + n) + θ + ψ_k (6)
- the frequency of the signal may be estimated using the above equations.
- for the m-th antenna, the phase may similarly be represented as:
- φ_m (n) = 2πf (t_m + n) + θ + ψ_m (7)
- from equations (6) and (7) , the phase difference ψ_k - ψ_m between the k-th and the m-th antenna can be estimated.
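The phase-change-rate frequency estimate that drives this compensation can be sketched as below. The synthetic tone and sample rate are assumptions for illustration:

```python
import cmath
import math

def estimate_frequency(iq_samples, sample_rate):
    """Estimate the carrier frequency from consecutive IQ samples: the
    instantaneous phase comes from the in-phase/quadrature components, and
    the mean phase-change per sample times the sample rate over 2*pi gives
    the frequency (illustrative sketch of the scheme described above)."""
    deltas = [
        cmath.phase(b * a.conjugate())  # phase change between samples
        for a, b in zip(iq_samples, iq_samples[1:])
    ]
    mean_delta = sum(deltas) / len(deltas)
    return mean_delta * sample_rate / (2 * math.pi)

# Synthetic 250 kHz tone sampled at 1 MHz (assumed rates).
fs, f0 = 1e6, 250e3
samples = [cmath.exp(2j * math.pi * f0 * n / fs) for n in range(32)]
print(round(estimate_frequency(samples, fs)))  # → 250000
```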
- the frequency cancellation method alternates the antenna switching pattern (ASP) .
- the reference antenna is sampled between the sampling of consecutive antennas. That is, the reference antenna is sampled between the sampling of a k-th antenna and the sampling of a (k+1) -th antenna.
- a stride-3 ASP (S3-ASP) switching pattern, where the reference antenna is sampled every 3 samples, looks as follows:
- φ_0 (n) = 2πf (t_0 + n) + θ + ψ_0 (8)
- φ_k (n + T) = 2πf (t_0 + T + n) + θ + ψ_k
- φ_k (n + 2T) = 2πf (t_0 + 2T + n) + θ + ψ_k
- φ_0 (n + 3T) = 2πf (t_0 + 3T + n) + θ + ψ_0
- a stride-2 ASP (S2-ASP) , in which the reference antenna is sampled after every single peripheral antenna sample, follows the same form.
- the S2-ASP reduces the overall time to finish one round of measurements across all antennas compared to the S3-ASP.
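The two switching patterns can be generated mechanically. In the sketch below, index 0 stands for the reference antenna and indices 1..K for the peripheral antennas, an illustrative encoding not mandated by the text:

```python
def antenna_switch_pattern(n_peripheral, stride):
    """Generate an antenna switch pattern: with stride s, the reference
    antenna (index 0) is revisited every s slots, matching the S2/S3
    patterns described above."""
    pattern = []
    for k in range(1, n_peripheral + 1):
        pattern.append(0)                   # sample the reference antenna
        pattern.extend([k] * (stride - 1))  # then the k-th peripheral antenna
    pattern.append(0)                       # closing reference sample
    return pattern

print(antenna_switch_pattern(3, 3))  # S3-ASP
print(antenna_switch_pattern(3, 2))  # S2-ASP, a shorter round of measurements
```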
- FIG. 6 illustrates a flow diagram of a method for determining the direction of a target object using an antenna switching pattern, according to one embodiment.
- the reference antenna 410 is selected 610 using antenna switch 370 and the reference antenna is sampled 615 using a wireless radio 360.
- the first peripheral antenna 420 is then selected 620 by the antenna switch 370 and the first peripheral antenna 420 is sampled 625 using the wireless radio 360.
- a set number of samples is obtained for the peripheral antenna 420. For instance, if an S3-ASP is used, the peripheral antenna is sampled twice.
- the reference antenna 410 is then selected 630 and sampled 635.
- the antenna switch 370 selects 650 a next peripheral antenna 420 and steps 625 to 640 are repeated. Based on the obtained sampled signals, a direction finding algorithm is performed 660 to determine the direction of the target object. In some embodiments, after the direction finding algorithm is performed, the process goes back to step 620 and steps 620 to 660 are repeated.
- FIG. 7 shows a circular antenna array with two wireless radios, according to one embodiment.
- the reference antenna 410 (antenna 0) is coupled to a first radio front end (radio front end A) 715A
- the peripheral antennas 420 are coupled to antenna switch 710, which in turn, is coupled to a second radio front end (radio front end B) 715B.
- each of the peripheral antennas 420 is connected to the antenna switch 710 using antenna cables with equal or substantially equal lengths.
- the radio front end A 715A is coupled to a first BLE chip (BLE chip A) 720A, which is coupled to a first ADC (ADC A) 725A.
- the radio front end B 715B is coupled to a second BLE chip (BLE chip B) 720B, which is coupled to a second ADC (ADC B) 725B.
- the antenna switch 710, radio front ends 715A and 715B, BLE chips 720A and 720B, and ADCs 725A and 725B are controlled by the CCL 730.
- the DF algorithm 735 determines the direction of a target object.
- the output of the DF algorithm 735 may be filtered using a direction filter 740 and the output of the direction filter 740 is provided to the CSSS host interface for controlling the orientation of a video capturing device.
- the reference antenna 410 is continuously sampled, whereas the peripheral antennas 420 are sampled according to a linear antenna switch pattern.
- FIG. 8 illustrates a flow diagram of a method for determining the direction of a target object using two wireless radios, according to one embodiment.
- a signal is sampled 810 from the reference antenna 410 using a first wireless radio (i.e., radio front end A 715A) .
- a first peripheral antenna 420 is selected 820 by antenna switch 710 and a signal is sampled 825 from the selected antenna using a second wireless radio (i.e., radio front end B 715B) . If there are more peripheral antennas to sample, as determined in step 830, the antenna switch 710 then selects 840 a next peripheral antenna 420 of the antenna array. After the next peripheral antenna 420 is selected steps 810 to 830 are repeated using the selected peripheral antenna.
- the reference antenna is sampled 810 using the first wireless radio and the selected next antenna is sampled using the second wireless radio. Otherwise, if all the peripheral antennas have been sampled, a direction finding algorithm is performed using the signals sampled by the first and second wireless radios.
- FIG. 9 illustrates a diagram for reducing a multipath effect of signals on a circular antenna array, according to one embodiment.
- neighboring antennas may affect the performance of the direction finding algorithm. For instance, in the circular antenna array of FIG. 9, antennas located in the front based on the direction of the incoming signal may receive more energy from the direct path of the target device, whereas antennas located in the back may receive a weaker signal and will likely include energy from a reflected path.
- to reduce this effect, the antennas may be divided into groups: each group is based on three antennas that are located along a diameter of the circle formed by the peripheral antennas. That is, each group is based on two peripheral antennas located opposite each other and the reference antenna.
- a first group (group A) is formed based on antennas 0, 1, and 4; a second group (group B) is formed based on antennas 0, 2, and 5; and a third group (group C) is formed based on antennas 0, 3, and 6.
- each set of three antennas that a group is based on splits the remaining peripheral antennas into two sets.
- each of the groups includes the set of three antennas the group is based on and one of the two sets of remaining peripheral antennas.
- the first group (group A) includes antennas 0, 1, 2, 3, and 4; the second group includes antennas 0, 2, 3, 4, and 5; and the third group (group C) includes antennas 0, 3, 4, 5, and 6.
- alternatively, the first group includes antennas 0, 1, 4, 5, and 6; the second group includes antennas 0, 1, 2, 5, and 6; and the third group includes antennas 0, 1, 2, 3, and 6.
- Each group of antennas has a range of directions for which the direction finding algorithm is more accurate than the other groups.
- group A is more accurate at determining the direction of a target object located at direction 910 compared to other groups.
- the direction of the target object may be determined using each antenna group and a direction is selected based on the accuracy of each of the groups for identifying a target object located at each of the determined directions. That is, a first direction for the target object is determined using the first antenna group, a second direction is determined using the second antenna group, and a third direction is determined using the third antenna group. Then an accuracy for the first group to determine a target object in the first direction is determined.
- an accuracy for the second group and the third group to determine a target object located in the second and third direction, respectively, is determined. Based on the accuracy determined for each of the antenna groups, a single direction for the target object is selected. That is, if the accuracy of the first group for identifying an object located in the first direction is greater than the accuracy of the second group for identifying an object located in the second direction and the accuracy of the third group for identifying an object located in the third direction, then the first direction is selected as the direction of the target object.
- a single direction is identified for the target object using every antenna of the antenna array, and a determination is made whether the signal characteristics of the sampled signals agree with the expected signal characteristics at each antenna based on the determined direction of the target object. That is, if the identified direction of the target object is direction 910, a determination is made whether the signals sampled for antennas 2 and 3 have a higher energy than the other antennas. Similarly, if the identified direction of the target object is direction 920, a determination is made whether the signals sampled for antennas 4 and 5 have higher energy than the other antennas. If a determination is made that the characteristics of the sampled signals do not agree with the expected characteristics of those signals, the identified direction is considered to be inaccurate. As such, the identified direction may be filtered out.
- the camera steering control logic uses a “dead zone” when deciding whether to rotate the video capturing device 105, as shown in FIG. 10.
- the dead zone spans a given angle to the left and right side of the current camera orientation. As long as the target object 160 stays within the dead zone, the video capturing device 105 is not rotated by the rotary system 110.
- the dead zone is a quarter of the field of view (FoV) of the video capturing device 105.
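Combining the two statements above (a dead zone of a quarter of the FoV, taken here to be split evenly to each side of the current orientation — the even split is an assumption) gives a simple steering check:

```python
def should_rotate(target_angle, camera_angle, fov):
    """Dead-zone steering check (sketch): the dead zone totals fov/4, i.e.
    fov/8 degrees to each side of the current camera orientation; the
    camera rotates only when the target leaves that zone."""
    half_dead_zone = fov / 8  # total dead-zone width = fov / 4
    return abs(target_angle - camera_angle) > half_dead_zone

# With a 60-degree FoV the half zone is 7.5 degrees.
print(should_rotate(3, 0, 60))   # target inside the dead zone: stay put
print(should_rotate(10, 0, 60))  # target outside: rotate
```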
- the camera steering control logic further uses an “overshooting zone” that spans a given angle to the left and right side of the FoV of the video capturing device 105 after being rotated by the rotary system 110.
- the camera steering control logic identifies an orientation of the video capturing device 105 so that the target object 160 is within the overshooting zone.
- the camera steering control logic identifies the orientation to rotate the video capturing device to by identifying a zone surrounding the target object 160 having a size equal to the overshooting zone.
- a video capturing device orientation that is within the identified zone is then determined as the orientation to which the video capturing device 105 is to be rotated.
- the camera steering control logic further controls the speed of the rotation of the video capturing device 105 based on the speed of the target object 160.
- the rotational speed of the video capturing device 105 is linearly proportional to the speed of the target object 160.
- the rotational speed of the video capturing device 105 is an exponential function of the speed of the target object 160.
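The two speed mappings described above can be sketched side by side; the gain and base below are illustrative constants, not values from the text:

```python
def rotation_speed(target_speed, mode="linear", k=0.8, base=1.5):
    """Map target speed to camera rotation speed using either the linear
    or the exponential variant described above (k and base are assumed
    illustrative constants)."""
    if mode == "linear":
        return k * target_speed          # linearly proportional
    return k * (base ** target_speed - 1)  # exponential function of speed

print(rotation_speed(10))                  # linear mapping
print(rotation_speed(10, mode="exp"))      # exponential mapping grows much faster
```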
- the rotary system 110 may track the movement of more than one target object. For instance, the rotary system 110 may track the movement of multiple players in a game of soccer. In another example, the rotary system 110 may track the movement of one or more players and the ball that is currently in play. In this scenario, the different target objects being tracked may be prioritized, as not every target object can be within the FoV of the video capturing device 105 at all times. For instance, the rotary system 110 may prioritize the tracking of a ball or a specific player in a game. If multiple target objects are prioritized, the rotary system 110 may select a direction that increases the number of tracked objects that are within the FoV of the video capturing device 105.
- the rotary system may further select a direction that reduces the amount of rotation of the video capturing device.
- the rotary system 110 determines, for each candidate orientation, a score based on the number of target objects 160 that would be within the FoV of the video capturing device 105 and the amount of rotation required to reach that orientation. The rotary system 110 then selects the orientation with the largest score as the orientation to which to rotate the video capturing device 105.
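The score-based selection among candidate orientations can be sketched as follows. The weights, the coarse candidate grid, and the linear penalty on rotation are assumptions for illustration; the disclosure does not specify a scoring formula:

```python
def choose_orientation(current_yaw, target_bearings, fov_deg,
                       candidates=None, w_count=1.0, w_rot=0.01):
    """Pick the candidate yaw with the highest score, where
    score = w_count * (targets inside the FoV) - w_rot * (degrees of rotation)."""
    if candidates is None:
        candidates = range(0, 360, 5)  # assumed coarse grid of candidate yaws

    def ang_diff(a, b):
        # Unsigned angular separation in [0, 180].
        return abs((a - b + 180.0) % 360.0 - 180.0)

    def score(yaw):
        in_fov = sum(1 for t in target_bearings
                     if ang_diff(yaw, t) <= fov_deg / 2.0)
        return w_count * in_fov - w_rot * ang_diff(yaw, current_yaw)

    return max(candidates, key=score)
```

With two targets at 10° and 20° and a 60° FoV, staying at yaw 0 already covers both and needs no rotation, so it wins; with a single target at 90°, the minimal-rotation covering yaw (60°) is chosen.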
- the direction of the target object 160 may be determined using a single rotary system 110. If multiple rotary systems 110 are used, the position of the target object 160 may be determined.
- FIG. 11 shows a diagram of two rotary systems 110A and 110B tracking an object 160 moving from position 1110A to position 1110B, according to one embodiment. Using rotary system 110A, angles α1 and α2 can be found. Similarly, using rotary system 110B, angles β1 and β2 can be found. If the positions of the rotary systems 110A and 110B are known, the positions 1110A and 1110B can be determined by triangulation.
- the distance d between the first position 1110A and the second position 1110B can then be found as the Euclidean distance between the two triangulated positions.
- the distance d between the first position 1110A and the second position 1110B may be determined using an inertial measurement unit.
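The two-station triangulation of FIG. 11 can be sketched as below. The coordinate frame, the bearing convention (angles measured from the +x axis), and the station layout are assumptions for illustration:

```python
import math

def intersect(pa, alpha_deg, pb, beta_deg):
    """Locate a target from bearings measured at two stations of known position.

    pa, pb: (x, y) positions of the two rotary systems.
    alpha_deg, beta_deg: bearings from each station to the target.
    """
    ax, ay = pa
    bx, by = pb
    ca, sa = math.cos(math.radians(alpha_deg)), math.sin(math.radians(alpha_deg))
    cb, sb = math.cos(math.radians(beta_deg)), math.sin(math.radians(beta_deg))
    # Rays: pa + t*(ca, sa) and pb + s*(cb, sb); solve the 2x2 system for t.
    denom = ca * sb - sa * cb
    if abs(denom) < 1e-12:
        raise ValueError("bearings are parallel; no unique intersection")
    t = ((bx - ax) * sb - (by - ay) * cb) / denom
    return (ax + t * ca, ay + t * sa)

def travelled_distance(pa, pb, a1, b1, a2, b2):
    """Distance d between the target's first and second triangulated positions."""
    x1, y1 = intersect(pa, a1, pb, b1)
    x2, y2 = intersect(pa, a2, pb, b2)
    return math.hypot(x2 - x1, y2 - y1)
```

For stations at (0, 0) and (10, 0), bearings of 45° and 135° intersect at (5, 5); a second pair of bearings toward (8, 4) then gives d = √10.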
- the position of the target object may be found without knowledge of the relative position of the rotary systems 110.
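The claims below describe determining direction from per-antenna phase differences against a central reference antenna. A minimal sketch of one way this can work, assuming a circular array of radius r with the reference at the centre, where a plane wave from azimuth θ produces an expected phase difference of (2πr/λ)·cos(θ − φᵢ) at peripheral antenna i. The grid search and all numeric values are illustrative; a practical implementation would use a closed form or a MUSIC-style estimator:

```python
import math

def estimate_direction(phase_diffs, antenna_angles_deg, radius, wavelength):
    """Estimate the transmitter azimuth (degrees) from per-antenna phase differences.

    phase_diffs[i]: measured phase at peripheral antenna i minus the phase at
    the central reference antenna, in radians.
    """
    k = 2.0 * math.pi * radius / wavelength
    best_theta, best_err = 0.0, float("inf")
    for tenth_deg in range(0, 3600):          # 0.1-degree grid over a full circle
        theta = math.radians(tenth_deg / 10.0)
        err = 0.0
        for dphi, phi_deg in zip(phase_diffs, antenna_angles_deg):
            expected = k * math.cos(theta - math.radians(phi_deg))
            # Compare on the unit circle to sidestep 2*pi phase wrap-around.
            err += ((math.cos(dphi) - math.cos(expected)) ** 2
                    + (math.sin(dphi) - math.sin(expected)) ** 2)
        if err < best_err:
            best_theta, best_err = theta, err
    return math.degrees(best_theta)
```

Simulating a 2.4 GHz source (λ ≈ 0.125 m) at azimuth 30° against a 4 cm array with four peripheral antennas recovers the bearing to within the grid resolution.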
- Modules may constitute either software modules (e.g., code embodied on a machine-readable medium or in a transmission signal) or hardware modules.
- a hardware module is a tangible unit capable of performing certain operations and may be configured or arranged in a certain manner.
- one or more computer systems (e.g., a standalone, client, or server computer system) or one or more hardware modules of a computer system (e.g., a processor or a group of processors) may be configured by software (e.g., an application or application portion) as a hardware module that operates to perform certain operations as described herein.
- a hardware module may be implemented mechanically or electronically.
- a hardware module may comprise dedicated circuitry or logic that is permanently configured (e.g., as a special-purpose processor, such as a field programmable gate array (FPGA) or an application-specific integrated circuit (ASIC)) to perform certain operations.
- a hardware module may also comprise programmable logic or circuitry (e.g., as encompassed within a general-purpose processor or other programmable processor) that is temporarily configured by software to perform certain operations. It will be appreciated that the decision to implement a hardware module mechanically, in dedicated and permanently configured circuitry, or in temporarily configured circuitry (e.g., configured by software) may be driven by cost and time considerations.
- processors may be temporarily configured (e.g., by software) or permanently configured to perform the relevant operations. Whether temporarily or permanently configured, such processors may constitute processor-implemented modules that operate to perform one or more operations or functions.
- the modules referred to herein may, in some example embodiments, comprise processor-implemented modules.
- the one or more processors may also operate to support performance of the relevant operations in a “cloud computing” environment or as a “software as a service” (SaaS).
- at least some of the operations may be performed by a group of computers (as examples of machines including processors), these operations being accessible via a network (e.g., the Internet) and via one or more appropriate interfaces (e.g., application program interfaces (APIs)).
- the performance of certain of the operations may be distributed among the one or more processors, not only residing within a single machine, but deployed across a number of machines.
- the one or more processors or processor-implemented modules may be located in a single geographic location (e.g., within a home environment, an office environment, or a server farm). In other example embodiments, the one or more processors or processor-implemented modules may be distributed across a number of geographic locations.
- any reference to “one embodiment” or “an embodiment” means that a particular element, feature, structure, or characteristic described in connection with the embodiment is included in at least one embodiment.
- the appearances of the phrase “in one embodiment” in various places in the specification are not necessarily all referring to the same embodiment.
- some embodiments may be described using the terms “coupled” and “connected” along with their derivatives.
- some embodiments may be described using the term “coupled” to indicate that two or more elements are in direct physical or electrical contact.
- the term “coupled” may also mean that two or more elements are not in direct contact with each other, but yet still co-operate or interact with each other.
- the embodiments are not limited in this context.
- the terms “comprises,” “comprising,” “includes,” “including,” “has,” “having,” or any other variation thereof are intended to cover a non-exclusive inclusion.
- a process, method, article, or apparatus that comprises a list of elements is not necessarily limited to only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus.
- “or” refers to an inclusive or and not to an exclusive or.
- condition A or B is satisfied by any one of the following: A is true (or present) and B is false (or not present), A is false (or not present) and B is true (or present), and both A and B are true (or present).
Landscapes
- Engineering & Computer Science (AREA)
- Multimedia (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- Variable-Direction Aerials And Aerial Arrays (AREA)
Abstract
Description
Claims (20)
- An object tracking device for tracking the movement of a target object, the object tracking device comprising: a direction finding subsystem for determining a direction of the target object, the direction finding subsystem comprising: a plurality of peripheral antennas, a reference antenna, an antenna switch coupled to the plurality of peripheral antennas, the antenna switch for selecting one of the plurality of peripheral antennas, a first wireless radio coupled to the antenna switch, the first wireless radio for receiving a wireless signal from the target object via a peripheral antenna selected by the antenna switch, a second wireless radio coupled to the reference antenna, the second wireless radio for receiving the wireless signal from the target object via the reference antenna, and a processing unit for controlling the antenna switch and determining the direction of the target object based on signals sampled by the first wireless radio and the second wireless radio; and a steering subsystem coupled to the direction finding subsystem, the steering subsystem comprising: a holder for attaching a video capturing device to the steering subsystem, one or more motors for rotating the holder, and a control unit for receiving the direction of the target object determined by the direction finding subsystem and controlling the one or more motors based on the received direction of the target object.
- The object tracking device of claim 1, wherein the plurality of peripheral antennas are arranged in a circle, and wherein the reference antenna is in a center of the circle.
- The object tracking device of claim 1, wherein the processing unit is configured to: control the antenna switch to select a first peripheral antenna of the plurality of peripheral antennas; control the first wireless radio to sample a first wireless signal using the first peripheral antenna, and control the second wireless radio to sample the first wireless signal using the reference antenna; and responsive to the first and second wireless radios sampling the first wireless signal: control the antenna switch to select a second peripheral antenna of the plurality of peripheral antennas, and control the first wireless radio to sample a second wireless signal using the second peripheral antenna, and control the second wireless radio to sample the second wireless signal using the reference antenna.
- The object tracking device of claim 3, wherein the processing unit is further configured to: determine a first phase difference between the first wireless signal sampled by the first peripheral antenna and the reference antenna; determine a second phase difference between the second wireless signal sampled by the second peripheral antenna and the reference antenna; and determine a direction of a target object that transmitted the first wireless signal and the second wireless signal based on (1) the first determined phase difference between the first peripheral antenna and the reference antenna, and (2) the second determined phase difference between the second peripheral antenna and the reference antenna.
- The object tracking device of claim 1, wherein the wireless signal is a Bluetooth signal.
- The object tracking device of claim 5, wherein the wireless signal corresponds to one of (1) a stream of zeros, and (2) a stream of ones.
- The object tracking device of claim 1, wherein the control unit of the steering subsystem is configured to: determine whether the direction of the target object is within a field of view of the video capturing device; and responsive to determining that the direction of the target object is not within the field of view of the video capturing device: rotate the video capturing device so that the direction of the target object is within the field of view of the video capturing device.
- The object tracking device of claim 1, wherein the control unit of the steering subsystem is configured to: determine whether the direction of the target object is within a first range; and responsive to determining that the direction of the target object is not within the first range: identify a second range from a plurality of preset ranges based on the direction of the target object, and actuate the one or more motors based on a value associated with the second range.
- A direction finding device for determining a direction of the target object, the direction finding device comprising: a plurality of peripheral antennas; a reference antenna; an antenna switch coupled to the plurality of peripheral antennas, the antenna switch for selecting one of the plurality of peripheral antennas; a first wireless radio coupled to the antenna switch, the first wireless radio for receiving a wireless signal from the target object via a peripheral antenna selected by the antenna switch; a second wireless radio coupled to the reference antenna, the second wireless radio for receiving the wireless signal from the target object via the reference antenna; and a processing unit for controlling the antenna switch and determining the direction of the target object based on signals sampled by the first wireless radio and the second wireless radio.
- The direction finding device of claim 9, wherein the plurality of peripheral antennas are arranged in a circle, and wherein the reference antenna is in a center of the circle.
- The direction finding device of claim 10, wherein the processing unit is configured to: control the antenna switch to select a first peripheral antenna of the plurality of peripheral antennas; control the first wireless radio to sample a first wireless signal using the first peripheral antenna, and control the second wireless radio to sample the first wireless signal using the reference antenna; and responsive to the first and second wireless radios sampling the first wireless signal: control the antenna switch to select a second peripheral antenna of the plurality of peripheral antennas, and control the first wireless radio to sample a second wireless signal using the second peripheral antenna, and control the second wireless radio to sample the second wireless signal using the reference antenna.
- The direction finding device of claim 11, wherein the processing unit is further configured to: determine a phase difference between the first wireless signal sampled by the first peripheral antenna and the reference antenna; determine a phase difference between the second wireless signal sampled by the second peripheral antenna and the reference antenna; and determine a direction of a target object that transmitted the first wireless signal and the second wireless signal based on (1) the determined phase difference between the first peripheral antenna and the reference antenna, and (2) the determined phase difference between the second peripheral antenna and the reference antenna.
- The direction finding device of claim 9, wherein the wireless signal is a Bluetooth signal.
- The direction finding device of claim 13, wherein the wireless signal corresponds to one of (1) a stream of zeros, and (2) a stream of ones.
- A computer-implemented method comprising: selecting a first peripheral antenna of a plurality of peripheral antennas; sampling a first wireless signal using the first peripheral antenna; sampling the first wireless signal using a reference antenna; determining a phase difference between the first wireless signal sampled by the first peripheral antenna and the first wireless signal sampled by the reference antenna; selecting a second peripheral antenna of the plurality of peripheral antennas; sampling a second wireless signal using the second peripheral antenna; sampling the second wireless signal using the reference antenna; determining a phase difference between the second wireless signal sampled by the second peripheral antenna and the second wireless signal sampled by the reference antenna; and determining a direction of a target object that transmitted the first wireless signal and the second wireless signal based on (1) the determined phase difference between the first wireless signal sampled by the first peripheral antenna and the first wireless signal sampled by the reference antenna, and (2) the determined phase difference between the second wireless signal sampled by the second peripheral antenna and the second wireless signal sampled by the reference antenna.
- The computer-implemented method of claim 15, wherein the plurality of peripheral antennas are arranged in a circle, and wherein the reference antenna is in a center of the circle.
- The computer-implemented method of claim 15, wherein the wireless signal is a Bluetooth signal.
- The computer-implemented method of claim 15, wherein the wireless signal corresponds to one of (1) a stream of zeros, and (2) a stream of ones.
- The computer-implemented method of claim 15, further comprising: determining whether the direction of the target object is within a field of view of a video capturing device; and responsive to determining that the direction of the target object is not within the field of view of the video capturing device: rotating the video capturing device so that the direction of the target object is within the field of view of the video capturing device.
- The computer-implemented method of claim 15, further comprising: determining whether the direction of the target object is within a first range; and responsive to determining that the direction of the target object is not within the first range: identifying a second range from a plurality of preset ranges based on the direction of the target object, and actuating a plurality of motors based on a value associated with the second range.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
PCT/CN2017/083588 WO2018205133A1 (en) | 2017-05-09 | 2017-05-09 | Direction finding of wireless communication devices |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
PCT/CN2017/083588 WO2018205133A1 (en) | 2017-05-09 | 2017-05-09 | Direction finding of wireless communication devices |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2018205133A1 (en) | 2018-11-15 |
Family
ID=64104259
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/CN2017/083588 WO2018205133A1 (en) | 2017-05-09 | 2017-05-09 | Direction finding of wireless communication devices |
Country Status (1)
Country | Link |
---|---|
WO (1) | WO2018205133A1 (en) |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN115997430A (en) * | 2020-08-25 | 2023-04-21 | 上海诺基亚贝尔股份有限公司 | Relative phase determination for frequency drift compensation |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20140253741A1 (en) * | 2013-03-08 | 2014-09-11 | Panasonic Corporation | Camera system and switching device |
CN104766106A (en) * | 2015-03-05 | 2015-07-08 | 桑田智能工程技术(上海)有限公司 | Intelligent tracking and monitoring system based on RFID label signal intensity |
CN105676865A (en) * | 2016-04-12 | 2016-06-15 | 北京博瑞爱飞科技发展有限公司 | Target tracking method, device and system |
CN106023251A (en) * | 2016-05-16 | 2016-10-12 | 西安斯凯智能科技有限公司 | Tracking system and tracking method |
CN106584516A (en) * | 2016-11-01 | 2017-04-26 | 河池学院 | Intelligent photographing robot for tracing specified object |
- 2017-05-09: WO PCT/CN2017/083588 patent/WO2018205133A1/en, active, Application Filing
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN105393079B (en) | Depth transducer control based on context | |
US10721384B2 (en) | Camera with radar system | |
EP3354007B1 (en) | Video content selection | |
JP6297278B2 (en) | Real-time radio frequency signal visualization device | |
US10942252B2 (en) | Tracking system and tracking method | |
US20160309095A1 (en) | Methods and apparatus for capturing images using multiple camera modules in an efficient manner | |
CN107852447B (en) | Balancing exposure and gain at an electronic device based on device motion and scene distance | |
US20220236359A1 (en) | Cooperative automatic tracking | |
US10938102B2 (en) | Search track acquire react system (STARS) drone integrated acquisition tracker (DIAT) | |
US20090304374A1 (en) | Device for tracking a moving object | |
CN101027900A (en) | System and method for the production of composite images comprising or using one or more cameras for providing overlapping images | |
CN107404615B (en) | Image recording method and electronic equipment | |
US20130286049A1 (en) | Automatic adjustment of display image using face detection | |
US11050972B1 (en) | Systems and methods for generating time-lapse videos | |
GB2463703A (en) | Estimating the direction in which a camera is pointing as a photograph is taken | |
CN109348170B (en) | Video monitoring method and device and video monitoring equipment | |
CN108370412B (en) | Control device, control method, and recording medium | |
CN111213365A (en) | Shooting control method and controller | |
WO2018205133A1 (en) | Direction finding of wireless communication devices | |
CN113784041A (en) | Automatic tracking photography holder and method based on UWB | |
US20170019585A1 (en) | Camera clustering and tracking system | |
CN115150748B (en) | Indoor positioning method, system, electronic equipment and storage medium | |
Fang et al. | Person tracking and identification using cameras and Wi-Fi Channel State Information (CSI) from smartphones: dataset | |
CN107979731B (en) | Method, device and system for acquiring audio and video data | |
CN109155818B (en) | Head rotation tracking device for video highlight recognition |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 17909102; Country of ref document: EP; Kind code of ref document: A1 |
NENP | Non-entry into the national phase | Ref country code: DE |
122 | Ep: pct application non-entry in european phase | Ref document number: 17909102; Country of ref document: EP; Kind code of ref document: A1 |
32PN | Ep: public notification in the ep bulletin as address of the addressee cannot be established | Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC (EPO FORM 1205A DATED 16.03.2020) |