WO2018076572A1 - Target tracking method, target tracking device, and storage medium - Google Patents

Target tracking method, target tracking device, and storage medium

Info

Publication number
WO2018076572A1
WO2018076572A1 (PCT/CN2017/073119)
Authority
WO
WIPO (PCT)
Prior art keywords
target
electronic device
position information
angle
relative position
Prior art date
Application number
PCT/CN2017/073119
Other languages
English (en)
French (fr)
Inventor
唐矗
孙晓路
陈子冲
卿明
吴庆
任冠佼
蒲立
Original Assignee
纳恩博(北京)科技有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 纳恩博(北京)科技有限公司 filed Critical 纳恩博(北京)科技有限公司
Priority to EP17863668.4A priority Critical patent/EP3410062A4/en
Priority to US16/078,087 priority patent/US20190049549A1/en
Publication of WO2018076572A1 publication Critical patent/WO2018076572A1/zh

Classifications

    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S13/00Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
    • G01S13/66Radar-tracking systems; Analogous systems
    • G01S13/72Radar-tracking systems; Analogous systems for two-dimensional tracking, e.g. combination of angle and range tracking, track-while-scan radar
    • G01S13/723Radar-tracking systems; Analogous systems for two-dimensional tracking, e.g. combination of angle and range tracking, track-while-scan radar by using numerical data
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S13/00Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
    • G01S13/74Systems using reradiation of radio waves, e.g. secondary radar systems; Analogous systems
    • G01S13/82Systems using reradiation of radio waves, e.g. secondary radar systems; Analogous systems wherein continuous-type signals are transmitted
    • G01S13/825Systems using reradiation of radio waves, e.g. secondary radar systems; Analogous systems wherein continuous-type signals are transmitted with exchange of information between interrogator and responder
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S13/00Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
    • G01S13/86Combinations of radar systems with non-radar systems, e.g. sonar, direction finder
    • G01S13/867Combination of radar systems with cameras
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S3/00Direction-finders for determining the direction from which infrasonic, sonic, ultrasonic, or electromagnetic waves, or particle emission, not having a directional significance, are being received
    • G01S3/78Direction-finders for determining the direction from which infrasonic, sonic, ultrasonic, or electromagnetic waves, or particle emission, not having a directional significance, are being received using electromagnetic waves other than radio waves
    • G01S3/782Systems for determining direction or deviation from predetermined direction
    • G01S3/785Systems for determining direction or deviation from predetermined direction using adjustment of orientation of directivity characteristics of a detector or detector system to give a desired condition of signal derived from that detector or detector system
    • G01S3/786Systems for determining direction or deviation from predetermined direction using adjustment of orientation of directivity characteristics of a detector or detector system to give a desired condition of signal derived from that detector or detector system the desired condition being maintained automatically
    • G01S3/7864T.V. type tracking systems
    • G01S3/7865T.V. type tracking systems using correlation of the live video image with a stored image
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S5/00Position-fixing by co-ordinating two or more direction or position line determinations; Position-fixing by co-ordinating two or more distance determinations
    • G01S5/02Position-fixing by co-ordinating two or more direction or position line determinations; Position-fixing by co-ordinating two or more distance determinations using radio waves
    • G01S5/0205Details
    • G01S5/0226Transmitters
    • G01S5/0231Emergency, distress or locator beacons
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S5/00Position-fixing by co-ordinating two or more direction or position line determinations; Position-fixing by co-ordinating two or more distance determinations
    • G01S5/02Position-fixing by co-ordinating two or more direction or position line determinations; Position-fixing by co-ordinating two or more distance determinations using radio waves
    • G01S5/0257Hybrid positioning
    • G01S5/0258Hybrid positioning by combining or switching between measurements derived from different systems
    • G01S5/02585Hybrid positioning by combining or switching between measurements derived from different systems at least one of the measurements being a non-radio measurement
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S5/00Position-fixing by co-ordinating two or more direction or position line determinations; Position-fixing by co-ordinating two or more distance determinations
    • G01S5/02Position-fixing by co-ordinating two or more direction or position line determinations; Position-fixing by co-ordinating two or more distance determinations using radio waves
    • G01S5/0294Trajectory determination or predictive filtering, e.g. target tracking or Kalman filtering
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S5/00Position-fixing by co-ordinating two or more direction or position line determinations; Position-fixing by co-ordinating two or more distance determinations
    • G01S5/18Position-fixing by co-ordinating two or more direction or position line determinations; Position-fixing by co-ordinating two or more distance determinations using ultrasonic, sonic, or infrasonic waves
    • G01S5/28Position-fixing by co-ordinating two or more direction or position line determinations; Position-fixing by co-ordinating two or more distance determinations using ultrasonic, sonic, or infrasonic waves by co-ordinating position lines of different shape, e.g. hyperbolic, circular, elliptical or radial
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05DSYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/0094Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots involving pointing a payload, e.g. camera, weapon, sensor, towards a fixed or moving target
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05DSYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/02Control of position or course in two dimensions
    • G05D1/021Control of position or course in two dimensions specially adapted to land vehicles
    • G05D1/0231Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means
    • G05D1/0246Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means using a video camera in combination with image processing means
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05DSYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/02Control of position or course in two dimensions
    • G05D1/021Control of position or course in two dimensions specially adapted to land vehicles
    • G05D1/0276Control of position or course in two dimensions specially adapted to land vehicles using signals provided by a source external to the vehicle
    • G05D1/028Control of position or course in two dimensions specially adapted to land vehicles using signals provided by a source external to the vehicle using a RF signal
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05DSYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/12Target-seeking control
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/20Analysis of motion
    • G06T7/246Analysis of motion using feature-based methods, e.g. the tracking of corners or segments
    • G06T7/248Analysis of motion using feature-based methods, e.g. the tracking of corners or segments involving reference images or patches
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/20Analysis of motion
    • G06T7/292Multi-camera tracking
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B64AIRCRAFT; AVIATION; COSMONAUTICS
    • B64UUNMANNED AERIAL VEHICLES [UAV]; EQUIPMENT THEREFOR
    • B64U2101/00UAVs specially adapted for particular uses or applications
    • B64U2101/30UAVs specially adapted for particular uses or applications for imaging, photography or videography
    • B64U2101/31UAVs specially adapted for particular uses or applications for imaging, photography or videography for surveillance
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B64AIRCRAFT; AVIATION; COSMONAUTICS
    • B64UUNMANNED AERIAL VEHICLES [UAV]; EQUIPMENT THEREFOR
    • B64U2201/00UAVs characterised by their flight controls
    • B64U2201/20Remote controls
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05DSYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/02Control of position or course in two dimensions
    • G05D1/021Control of position or course in two dimensions specially adapted to land vehicles
    • G05D1/0212Control of position or course in two dimensions specially adapted to land vehicles with means for defining a desired trajectory

Definitions

  • the present invention relates to the field of electronic technologies, and in particular, to a target tracking method, a target tracking device, and a storage medium.
  • At present, robots have become widespread and are applied in various fields. For example, ground robots (e.g., balance vehicles) can be used for personal transportation and security patrol, while aerial robots (e.g., drones) can be used for material transportation, disaster relief, terrain exploration, power-line inspection, film and television shooting, and so on.
  • A robot can track a target.
  • While tracking a target, the robot mostly uses a camera to collect image data of the target, uses a visual tracking algorithm to determine the relative position of the target and the robot, and then tracks the target based on that relative position.
  • However, this method has poor stability, is susceptible to interference from illumination changes, and has low robustness.
  • The embodiments of the present invention solve the technical problem that the target tracking method in the prior art has poor stability and low robustness, by providing a target tracking method, a target tracking device, and a storage medium.
  • an embodiment of the present invention provides a target tracking method, which is applied to an electronic device, where the electronic device is provided with a camera and a carrierless communication module, and the method includes:
  • the first relative position information includes: a first distance between the electronic device and the tracked target and/or a first angle between a traveling direction of the electronic device and the tracked target;
  • the second relative position information includes: a second distance between the electronic device and the tracked target and/or a second angle between a traveling direction of the electronic device and the tracked target;
  • the third relative position information includes a third distance between the electronic device and the tracked target and/or a third angle between the traveling direction of the electronic device and the tracked target.
  • the determining, by the carrierless communication module, the first relative location information of the tracked target and the electronic device including:
  • the first relative position information is determined by sensing the position of the beacon by the carrierless communication module; wherein the beacon is set on the target of the tracking.
  • the determining, by the camera, the second relative location information of the tracked target and the electronic device includes:
  • the second relative position information is determined using the target template based on a visual tracking algorithm.
  • the determining, in the image data, the target template corresponding to the target of the tracking comprising:
  • where Pt is the size of the target template in the subsequent frame image, P0 is the size of the target template in the initial frame image, d0 is the first distance determined at the initial moment corresponding to the initial frame image, and dt is the first distance determined at the subsequent moment corresponding to the subsequent frame image; the subsequent frame image and the initial frame image belong to the image data, and the subsequent frame image follows the initial frame image.
  • the determining, based on the first relative position information and the second relative position information, the third relative position information of the tracked target and the electronic device includes:
  • determining the third distance based on the equation d=duwb·cosθv, where d is the third distance, duwb is the first distance, and θv is the pitch angle of the camera;
  • determining the third angle based on the equation θ=σ·θvision+(1-σ)·θuwb, where θ is the third angle, θuwb is the first angle, θvision is the second angle, and σ is a constant between 0 and 1 used to adjust the weights of θvision and θuwb.
  • the controlling, based on the third relative position information, the electronic device to track the target includes at least one of the following:
  • adjusting the traveling direction of the electronic device based on the third angle, so that the electronic device travels toward the tracked target;
  • adjusting the pitch angle of the camera based on the third angle, so that the camera is aimed at the tracked target;
  • adjusting the traveling speed of the electronic device based on the third distance.
  • the present invention provides the following technical solutions through an embodiment of the present invention:
  • an embodiment of the present invention provides a target tracking device, which is applied to an electronic device, where the electronic device is provided with a camera and a carrierless communication module, and the target tracking device includes:
  • a first determining unit configured to determine first relative position information of the tracked target and the electronic device through the carrierless communication module;
  • a second determining unit configured to determine second relative position information of the tracked target and the electronic device through the camera;
  • a third determining unit configured to determine third relative position information of the tracked target and the electronic device based on the first relative position information and the second relative position information;
  • a control unit configured to control, based on the third relative position information, the electronic device to track the target.
  • the first relative position information includes: a first distance between the electronic device and the tracked target and/or a first angle between a traveling direction of the electronic device and the tracked target;
  • the second relative position information includes: a second distance between the electronic device and the tracked target and/or a second angle between a traveling direction of the electronic device and the tracked target;
  • the third relative position information includes a third distance between the electronic device and the tracked target and/or a third angle between the traveling direction of the electronic device and the tracked target.
  • the first determining unit includes:
  • a first determining subunit configured to sense a location of the beacon by the carrierless communication module to determine the first relative location information; wherein the beacon is set on the tracked target.
  • the second determining unit comprises:
  • Obtaining a subunit configured to acquire image data collected by the camera
  • a second determining subunit configured to determine, in the image data, a target template corresponding to the tracked target
  • a third determining subunit configured to determine the second relative position information by using the target template based on a visual tracking algorithm.
  • the second determining subunit is specifically configured to:
  • where Pt is the size of the target template in the subsequent frame image, P0 is the size of the target template in the initial frame image, d0 is the first distance determined at the initial moment corresponding to the initial frame image, and dt is the first distance determined at the subsequent moment corresponding to the subsequent frame image; the subsequent frame image and the initial frame image belong to the image data, and the subsequent frame image follows the initial frame image.
  • the third determining unit includes: a fourth determining subunit configured to determine the third distance based on the equation d=duwb·cosθv, and a fifth determining subunit configured to determine the third angle based on the equation θ=σ·θvision+(1-σ)·θuwb;
  • the control unit includes at least one of the following:
  • a first adjustment subunit configured to adjust a traveling direction of the electronic device based on the third angle, such that the electronic device travels toward the tracked target;
  • a second adjustment subunit configured to adjust a tilt angle of the camera based on the third angle, such that the camera is aligned with the tracked target
  • a third adjustment subunit configured to adjust a travel speed of the electronic device based on the third distance.
  • an embodiment of the present invention provides a target tracking device, applied to an electronic device, including a memory and a processor, where the memory stores executable instructions used to perform the target tracking method provided by the embodiments of the present invention.
  • an embodiment of the present invention provides a non-volatile storage medium storing executable instructions, where the executable instructions are used to execute the target tracking method provided by the embodiments of the present invention.
  • since the electronic device carries both the carrierless communication module and the camera, the first relative position information determined by the carrierless communication module and the second relative position information determined by the camera are fused to obtain third relative position information with higher accuracy, and the third relative position information is then used to control the electronic device to track the target; this effectively solves the technical problem that the target tracking method in the prior art has poor stability and low robustness, and achieves the technical effect of improving the stability and robustness of the target tracking method.
  • FIG. 1 is a flowchart of a target tracking method according to an embodiment of the present invention.
  • FIG. 2 and FIG. 3 are schematic diagrams of the target tracking method applied to a balance vehicle according to an embodiment of the present invention, where FIG. 2 is a side view and FIG. 3 is a top view;
  • FIG. 4 is a schematic diagram of a target tracking method applied to a drone according to an embodiment of the present invention.
  • FIG. 5 is a structural diagram of a target tracking apparatus according to an embodiment of the present invention.
  • the embodiment of the present invention solves the technical problem that the target tracking method in the prior art has poor stability and low robustness by providing a target tracking method and a target tracking device.
  • to solve the above technical problem, the technical solution of the embodiments of the present invention provides a target tracking method applied to an electronic device, where the electronic device is provided with a camera and a carrierless communication module, and the method includes: determining first relative position information of the tracked target and the electronic device through the carrierless communication module; determining second relative position information of the tracked target and the electronic device through the camera; determining third relative position information of the tracked target and the electronic device based on the first relative position information and the second relative position information; and controlling, based on the third relative position information, the electronic device to track the target.
  • the term "and/or" appearing herein merely describes an association relationship between associated objects, indicating that three relationships may exist; for example, A and/or B may indicate three cases: A exists alone, both A and B exist, and B exists alone.
  • the character "/" herein generally indicates that the associated objects before and after it are in an "or" relationship.
  • this embodiment provides a target tracking method applied to an electronic device; the electronic device may be a ground robot (for example, a balance vehicle), a drone (for example, a multi-rotor drone or a fixed-wing drone), an electric vehicle, or another device, and the specific type of the electronic device is not limited in this embodiment.
  • a camera and a carrierless communication module are disposed on the electronic device; the carrierless communication module may be a UWB (Ultra Wideband) communication module. UWB is a communication technology greatly different from traditional communication technologies: it does not use a carrier to transmit signals, but transmits data by sending and receiving extremely narrow pulses of nanosecond order or below. It has strong anti-interference performance, no carrier, and low device transmission power, and can be used for precise positioning with a distance accuracy of about 10 cm (a generic ranging sketch follows below).
  • the carrierless communication module of the embodiments of the present invention is not limited to a UWB communication module; in practical applications, any carrierless communication module that can be used for target tracking shall fall within the protection scope of the embodiments of the present invention.
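  • As a general illustration (not part of this disclosure) of how a carrierless module turns pulse timing into a distance estimate, the sketch below computes a two-way-ranging distance from a measured round-trip time; the function name and the reply-delay parameter are hypothetical.

```python
# Minimal sketch of UWB two-way ranging (illustrative only; not the patent's protocol).
SPEED_OF_LIGHT = 299_792_458.0  # m/s

def twr_distance(t_round_s: float, t_reply_s: float) -> float:
    """Estimate distance from a two-way-ranging exchange.

    t_round_s: time from sending the poll pulse to receiving the response (seconds)
    t_reply_s: known processing delay inside the beacon before it responds (seconds)
    """
    time_of_flight = (t_round_s - t_reply_s) / 2.0
    return SPEED_OF_LIGHT * time_of_flight

# Example: a 135 ns round trip with a 100 ns reply delay -> about 5.2 m
print(round(twr_distance(135e-9, 100e-9), 2))
```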
  • as shown in FIG. 2, taking a balance vehicle as an example, the camera and the UWB communication module can be mounted on the control lever of the balance vehicle.
  • as shown in FIG. 4, taking a drone as an example, the camera and the UWB communication module can be mounted under the drone, where the UWB communication module is a high-power UWB communication module with a long detection range, suitable for a drone flying at high altitude.
  • the target tracking method provided in this embodiment includes:
  • Step S101 Determine, by the carrierless communication module, the first relative position information of the tracked target and the electronic device.
  • step S101 includes:
  • the first relative position information is determined by sensing a position of the beacon by the carrierless communication module; wherein the beacon is set on the target of the tracking.
  • the first relative position information includes: a first distance between the electronic device and the tracked target and/or a first angle between the traveling direction of the electronic device and the tracked target.
  • in a specific implementation, the tracked target carries a UWB beacon; the UWB beacon is provided with a receiving antenna, and the UWB communication module is provided with a transmitting antenna. The UWB communication module can sense the position of the UWB beacon, thereby determining the first distance between the electronic device and the tracked target and/or the first angle between the traveling direction of the electronic device and the tracked target (a sketch of turning these two quantities into a planar relative position follows below). However, when the polarization direction of the receiving antenna of the UWB beacon is inconsistent with that of the transmitting antenna in the UWB communication module (for example, when the attitude of the UWB beacon changes), the accuracy of the obtained first angle drops sharply, and the tracking direction is prone to oscillation.
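  • For illustration only, the following sketch converts the first distance and first angle sensed by the UWB module into a planar position of the beacon relative to the electronic device; the coordinate convention (x forward along the traveling direction, y to the left) is an assumption made here, not something fixed by the embodiment.

```python
import math

def uwb_relative_position(d_uwb: float, theta_uwb_rad: float) -> tuple[float, float]:
    """Relative (x, y) of the beacon in the device frame.

    d_uwb: first distance between the device and the tracked target (meters)
    theta_uwb_rad: first angle between the device's traveling direction and the target (radians)
    Assumed convention: x axis along the traveling direction, y axis to the left.
    """
    x = d_uwb * math.cos(theta_uwb_rad)
    y = d_uwb * math.sin(theta_uwb_rad)
    return x, y

print(uwb_relative_position(3.0, math.radians(20)))  # target ~2.82 m ahead, ~1.03 m to the left
```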
  • Step S102 Determine, by the camera, the second relative position information of the tracked target and the electronic device.
  • step S102 includes:
  • the second relative position information includes: a second distance between the electronic device and the tracked target and/or a second angle between the traveling direction of the electronic device and the tracked target.
  • a camera or multiple cameras (for example, two or more cameras) may be disposed on the electronic device, and image data containing the target is collected by the camera; a target template corresponding to the tracked target is then determined in a certain frame of the image data (i.e., the initial frame image), and a visual tracking algorithm is then used to calculate the second distance between the electronic device and the tracked target and/or the second angle between the traveling direction of the electronic device and the tracked target.
  • the visual tracking algorithm may be any short-term tracking algorithm, which is not specifically limited herein.
  • the initial frame image collected by the camera may be displayed on the display screen, and the user's selection operation is acquired, and then the target template is determined from the initial frame image based on the selection operation.
  • the target template may also be determined by a method such as saliency detection or object detection.
  • based on the visual tracking algorithm, a model may be trained according to the target template corresponding to the tracked target defined in the initial frame image to track the target in subsequent frame images, and the model is continuously updated during tracking so as to adapt to changes in the attitude of the target object and overcome interference from complex backgrounds. Since no offline training is required, the method is highly versatile and can track any object specified by the user; moreover, the second angle calculated based on the visual tracking algorithm is highly accurate. However, because actual application scenarios are often very complex (for example, illumination changes and interference from similar targets), the robustness of the visual tracking algorithm is not good, the calculated second distance is easily affected and cannot reach the product application level, and it is also impossible to accurately determine whether the target has been lost (a minimal tracking sketch follows below).
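  • The embodiment does not fix a particular short-term tracking algorithm. As a stand-in, the sketch below locates a user-selected template with normalized cross-correlation (OpenCV's matchTemplate) and derives a second-angle estimate from the horizontal offset of the match; the field-of-view value and the helper names are assumptions for illustration.

```python
import cv2
import numpy as np

def track_template(frame_gray: np.ndarray, template_gray: np.ndarray):
    """Locate the target template in a frame; return (top-left corner, match score)."""
    result = cv2.matchTemplate(frame_gray, template_gray, cv2.TM_CCOEFF_NORMED)
    _, max_val, _, max_loc = cv2.minMaxLoc(result)
    return max_loc, max_val

def second_angle_deg(frame_width: int, target_center_x: float, hfov_deg: float = 60.0) -> float:
    """Approximate the second angle from the target's horizontal offset in the image.

    hfov_deg is an assumed horizontal field of view; a calibrated camera model would be used in practice.
    """
    offset = target_center_x - frame_width / 2.0
    return (offset / (frame_width / 2.0)) * (hfov_deg / 2.0)

if __name__ == "__main__":
    frame = np.random.randint(0, 255, (480, 640), dtype=np.uint8)
    template = frame[200:260, 300:360].copy()      # pretend the user selected this region
    (x, y), score = track_template(frame, template)
    cx = x + template.shape[1] / 2.0
    print("match score:", round(score, 3), "second angle (deg):", round(second_angle_deg(640, cx), 2))
```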
  • the determining, in the image data, the target template corresponding to the tracked target includes:
  • the size of the target template in the subsequent frame image is determined based on equation (1), where Pt is the size of the target template in the subsequent frame image, P0 is the size of the target template in the initial frame image, d0 is the first distance determined at the initial moment corresponding to the initial frame image, and dt is the first distance determined at the subsequent moment corresponding to the subsequent frame image; the subsequent frame image and the initial frame image belong to the image data, and the subsequent frame image follows the initial frame image.
  • for example, at time t0 (i.e., the initial moment), after the target template corresponding to the tracked target is determined from the initial frame image, the size P0 of the target template is recorded, together with the first distance d0 between the electronic device and the tracked target determined by the UWB communication module at time t0; at time t1 after t0 (i.e., a subsequent moment), when the visual tracking algorithm determines the second relative position information from a subsequent frame image (i.e., any frame after the initial frame), the first distance dt determined by the UWB communication module is acquired, and the size Pt of the target template corresponding to the tracked target in the subsequent frame image is determined based on equation (1).
  • when the visual tracking algorithm is used to determine the second relative position information, both the size of the target template and its position in the image need to be considered, and the size of the target template reflects how far the tracked target is from the electronic device (generally, the larger the target template, the closer the target; the smaller the target template, the farther the target). If the target template size is inaccurate, the accuracy of the second relative position information will suffer. Since the first distance measured by the UWB communication module is highly accurate, it is used here to correct the size of the target template, and the corrected target template size (i.e., Pt) is then used by the visual tracking algorithm to determine the second relative position information, which can greatly improve the accuracy of the second relative position information and the precision of the visual tracking algorithm (a minimal sketch of this correction follows below).
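  • The exact form of equation (1) is given in the original application as an image. Assuming the template size scales inversely with the UWB-measured first distance (consistent with the stated "larger template means closer target"), a minimal sketch of the correction could look as follows; the inverse-proportional form is an assumption made here, and the figure in the application is authoritative.

```python
def corrected_template_size(p0: float, d0: float, dt: float) -> float:
    """Rescale the initial template size P0 using UWB distances.

    Assumption: template size is inversely proportional to distance,
    i.e. P_t = P_0 * (d_0 / d_t); d0 and dt are the UWB first distances at the
    initial and subsequent moments. The patent gives equation (1) only as an image.
    """
    if dt <= 0:
        raise ValueError("distance must be positive")
    return p0 * (d0 / dt)

# Target moved from 2.0 m away to 4.0 m away -> template shrinks to half its size.
print(corrected_template_size(120.0, 2.0, 4.0))  # 60.0
```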
  • Step S103 Determine the third relative position information of the tracked target and the electronic device based on the first relative position information and the second relative position information.
  • the third relative position information includes: a third distance between the electronic device and the tracked target and/or a third angle between the traveling direction of the electronic device and the tracked target.
  • in addition, when the UWB communication module is used to track the target, the target will not be lost; therefore, the first relative position information determined by the UWB communication module can also be used to assist the visual tracking algorithm in determining whether the target has been lost, thereby solving the problem that the visual tracking algorithm cannot accurately determine whether the target has been lost.
  • under certain adverse conditions, the accuracy of the first relative position information or the second relative position information may drop sharply, seriously affecting the accuracy of the final third relative position information.
  • to avoid this, a confidence level may be set for each of the first relative position information and the second relative position information, respectively representing the degree to which each can be trusted. If the first relative position information and the second relative position information are highly consistent, the confidence levels are not considered and the two are directly fused to obtain the third relative position information; if they differ greatly, the information with the higher confidence plays the decisive role; or, when the confidence of a certain piece of information is too low, that information is simply not considered.
  • when the first relative position information is determined by sensing the position of the UWB beacon with the UWB communication module, interference from other wireless signals in the environment may make the determined first relative position information insufficiently accurate and not very trustworthy; therefore, a confidence level is set here for the first relative position information obtained at each moment, to represent how trustworthy it is.
  • specifically, when the first relative position information is determined at time t0, the waveform signal sensed by the UWB communication module from the UWB beacon (referred to as the original waveform signal) is saved for later use; at a time ti after t0, the waveform signal sensed by the UWB communication module at time ti (referred to as the i-th waveform signal) is compared with the original waveform signal, the similarity between the two is calculated, and this similarity is used as the confidence of the first relative position information acquired at time ti.
  • the greater the similarity, the more trustworthy the first relative position information acquired at time ti, i.e., the higher the confidence; the smaller the similarity, the less trustworthy it is, i.e., the lower the confidence.
  • when the second relative position information is determined based on the visual tracking algorithm, the target may be lost, in which case the target within the target template is no longer the original tracked target, so the determined second relative position information is inaccurate and not very trustworthy; therefore, a confidence level is set here for the second relative position information obtained from each frame image, to represent how trustworthy it is.
  • specifically, the image within the target template in the initial frame image (referred to as the original image) may be saved for later use; when the second relative position information is determined for a subsequent frame image, the image within the target template in that frame (referred to as the subsequent image) is acquired and compared with the original image, the similarity between the two is calculated, and this similarity is used as the confidence of the second relative position information determined from that frame.
  • the greater the similarity, the more trustworthy the second relative position information determined from the subsequent frame image, i.e., the higher the confidence; the smaller the similarity, the less trustworthy it is, i.e., the lower the confidence (a rough sketch of this confidence-weighted fusion follows below).
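  • A rough sketch of the confidence mechanism described above: each source gets a similarity-based confidence, the two estimates are averaged when they agree, the more trusted one dominates when they diverge, and a source whose confidence falls below a threshold is ignored. The thresholds and the normalized-correlation similarity measure are illustrative assumptions, not values from the disclosure.

```python
import numpy as np

def similarity(reference: np.ndarray, current: np.ndarray) -> float:
    """Confidence in [0, 1] from normalized correlation of two signals
    (an original vs. current UWB waveform, or an original vs. current template image)."""
    a = (reference - reference.mean()).ravel()
    b = (current - current.mean()).ravel()
    denom = np.linalg.norm(a) * np.linalg.norm(b)
    if denom == 0:
        return 0.0
    return float(max(0.0, np.dot(a, b) / denom))

def fuse(value_uwb: float, conf_uwb: float, value_vis: float, conf_vis: float,
         agree_tol: float = 0.2, min_conf: float = 0.3) -> float:
    """Fuse two distance/angle estimates using their confidences (illustrative thresholds)."""
    if abs(value_uwb - value_vis) <= agree_tol:            # consistent: simple average
        return 0.5 * (value_uwb + value_vis)
    if conf_uwb < min_conf and conf_vis >= min_conf:       # one source too unreliable: ignore it
        return value_vis
    if conf_vis < min_conf and conf_uwb >= min_conf:
        return value_uwb
    total = conf_uwb + conf_vis                            # otherwise the more trusted source dominates
    return (conf_uwb * value_uwb + conf_vis * value_vis) / total

print(fuse(10.0, 0.9, 11.5, 0.4))   # estimates disagree -> result weighted toward the UWB value
```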
  • as shown in FIG. 2 and FIG. 3, step S103 is illustrated below by taking a balance vehicle as the electronic device.
  • when determining the third distance, only the first distance may be considered. For example, the third distance can be determined based on equation (2): d=duwb·cosθv, where d is the third distance, duwb is the first distance, and θv is the pitch angle of the camera (the camera can be rotated up and down). The camera pitch angle is used to approximate the first angle θuwb; of course, the first angle θuwb can also be calculated based on the signal transmission between the UWB communication module (mounted on the balance vehicle) and the UWB beacon (carried by the tracked target).
  • the third angle between the traveling direction of the balance vehicle and the tracked target may be determined based on equation (3): θ=σ·θvision+(1-σ)·θuwb, where θ is the third angle, θuwb is the first angle, θvision is the second angle, and σ is a constant between 0 and 1 used to adjust the weights of θvision and θuwb. σ is related to the performance of the UWB communication module: the better the performance of the UWB communication module, the higher the accuracy of the measured θuwb and the larger the weight given to θuwb; the worse the performance, the lower the accuracy of θuwb and the smaller its weight (a minimal sketch of equations (2) and (3) follows below).
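  • A minimal sketch of equations (2) and (3) as applied to the balance-vehicle example; the σ value used here is only an illustrative tuning constant.

```python
import math

def third_distance(d_uwb: float, theta_v_rad: float) -> float:
    """Equation (2): d = d_uwb * cos(theta_v), projecting the UWB range with the camera pitch angle."""
    return d_uwb * math.cos(theta_v_rad)

def third_angle(theta_vision_rad: float, theta_uwb_rad: float, sigma: float = 0.7) -> float:
    """Equation (3): theta = sigma * theta_vision + (1 - sigma) * theta_uwb.

    sigma in [0, 1] weights the vision angle against the UWB angle; 0.7 is an illustrative choice
    (a better UWB module would justify a smaller sigma, i.e. more weight on theta_uwb)."""
    return sigma * theta_vision_rad + (1.0 - sigma) * theta_uwb_rad

d = third_distance(3.0, math.radians(15))                  # ~2.90 m
theta = third_angle(math.radians(8), math.radians(12))     # ~9.2 degrees
print(round(d, 2), round(math.degrees(theta), 1))
```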
  • as shown in FIG. 4, step S103 is illustrated below by taking a drone as the electronic device.
  • when determining the horizontal distance from the tracked target to the drone (i.e., the third distance), the value determined by the visual tracking algorithm has a large error and too low a confidence, so it is not considered.
  • the horizontal distance from the tracked target to the drone (i.e., the third distance) can be determined based on equation (4), where d is the third distance, duwb is the first distance, and θuwb is the first angle. Since the UWB communication module is fixed on the drone, the attitude tilt angle of the drone can be used to approximate the first angle θuwb; of course, the first angle θuwb can also be calculated based on the signal transmission between the UWB communication module (mounted on the drone) and the UWB beacon (carried by the tracked target).
  • when determining the third angle, the performance of the UWB communication module can also be considered, and the weights of the first angle and the second angle are adjusted based on that performance. For example, the angle between the traveling direction of the drone and the tracked target (i.e., the third angle) can be determined based on equation (3), where σ is related to the performance of the UWB communication module: the better the performance of the UWB communication module, the larger the weight of θuwb; the worse the performance, the smaller the weight of θuwb.
  • Step S104 The control electronic device tracks the target travel based on the third relative location information.
  • as an optional embodiment, step S104 includes: adjusting the traveling direction of the electronic device based on the third angle so that the electronic device travels toward the tracked target; and/or adjusting the pitch angle of the camera based on the third angle so that the camera is aimed at the tracked target.
  • as shown in FIG. 2 and FIG. 3, taking a balance vehicle as an example, the traveling direction of the balance vehicle can be adjusted based on the third angle so that the balance vehicle travels toward the tracked target; at the same time, the pitch angle of the camera on the balance vehicle is adjusted based on the third angle so that the camera is aimed at the tracked target, ensuring that the tracked target always stays at the center of the image captured by the camera (a simple proportional-correction sketch follows below).
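  • For illustration only, a simple proportional correction that steers the device toward the tracked target and re-aims the camera using the fused third angle; the gains, sign conventions, and controller interface are assumptions, not part of the disclosure.

```python
def steering_commands(third_angle_rad: float, camera_pitch_rad: float, target_elevation_rad: float,
                      k_yaw: float = 1.0, k_pitch: float = 0.8) -> tuple[float, float]:
    """Return (yaw_rate_command, camera_pitch_rate_command) as proportional corrections.

    third_angle_rad: fused angle between the traveling direction and the tracked target
    target_elevation_rad: elevation of the target in the camera frame (assumed available)
    k_yaw, k_pitch: illustrative proportional gains
    """
    yaw_rate = k_yaw * third_angle_rad                                 # turn toward the target
    pitch_rate = k_pitch * (target_elevation_rad - camera_pitch_rad)   # keep the target centered vertically
    return yaw_rate, pitch_rate

print(steering_commands(0.15, 0.05, 0.12))  # small turn toward the target, slight upward camera correction
```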
  • as shown in FIG. 4, taking a drone as an example, the flight direction of the drone can be adjusted based on the third angle (the pitch angle, yaw angle, and roll angle of the drone can be adjusted to adjust its flight direction), so that the drone flies toward the tracked target; at the same time, the pitch angle of the camera on the drone can be adjusted based on the third angle so that the camera is aimed at the tracked target, ensuring that the tracked target always stays at the center of the image captured by the camera.
  • as an optional embodiment, step S104 further includes: adjusting the traveling speed of the electronic device based on the third distance.
  • as shown in FIG. 2 and FIG. 3, taking a balance vehicle as an example, the traveling speed of the balance vehicle can be adjusted based on the third distance, where the traveling speed of the balance vehicle is proportional to the third distance: the larger the third distance, the higher the speed of the balance vehicle, so that the distance between the balance vehicle and the tracked target can be shortened in time to prevent losing the target; the smaller the third distance, the lower the speed of the balance vehicle, so as to avoid a collision between the balance vehicle and the tracked target.
  • as shown in FIG. 4, taking a drone as an example, the flight speed of the drone can be adjusted based on the third distance, where the flight speed of the drone is proportional to the third distance: the larger the third distance, the higher the flight speed, so that the distance between the drone and the tracked target can be shortened in time to prevent losing the target; the smaller the third distance, the lower the flight speed, so as to avoid a collision between the drone and the tracked target.
  • in addition, the traveling speed of the electronic device may also be adjusted based on the size of the target template corresponding to the tracked target, since the size of the target template changes across subsequent frame images. Specifically, the traveling speed (or flight speed) of the electronic device is inversely proportional to the size of the target template, and a sigmoid function can be used to formulate the speed decay model, so that the electronic device decelerates quickly when approaching the target object to avoid a collision (a rough sketch follows below).
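  • A rough sketch of the sigmoid-based speed decay mentioned above: the commanded speed falls smoothly toward zero as the target (equivalently, a growing target template) gets close, so the device decelerates quickly near the target. The maximum speed, midpoint, and steepness values are illustrative assumptions.

```python
import math

def speed_from_distance(d: float, v_max: float = 2.0, d_mid: float = 1.5, steepness: float = 3.0) -> float:
    """Sigmoid speed profile: near zero below d_mid, approaching v_max well beyond it.

    d: third distance (or a distance proxy derived from the template size), meters
    v_max, d_mid, steepness: illustrative tuning constants
    """
    return v_max / (1.0 + math.exp(-steepness * (d - d_mid)))

for dist in (0.5, 1.5, 3.0, 6.0):
    print(dist, "m ->", round(speed_from_distance(dist), 2), "m/s")
```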
  • in summary, a target tracking method is disclosed, applied to an electronic device provided with a camera and a carrierless communication module; the method includes: determining first relative position information of the tracked target and the electronic device through the carrierless communication module; determining second relative position information of the tracked target and the electronic device through the camera; determining third relative position information of the tracked target and the electronic device based on the first relative position information and the second relative position information; and controlling, based on the third relative position information, the electronic device to track the target.
  • since the electronic device carries both the carrierless communication module and the camera, the first relative position information determined by the carrierless communication module and the second relative position information determined by the camera are fused to obtain third relative position information with higher accuracy, and the third relative position information is then used to control the electronic device to track the target; this effectively solves the technical problem that the target tracking method in the prior art has poor stability and low robustness, and achieves the technical effect of improving the stability and robustness of the target tracking method.
  • the embodiment provides a target tracking device, which is applied to an electronic device.
  • the electronic device is provided with a camera and a carrierless communication module.
  • the target tracking device includes:
  • the first determining unit 201 is configured to determine first relative position information of the tracked target and the electronic device through the carrierless communication module;
  • the second determining unit 202 is configured to determine, by the camera, the second relative position information of the tracked target and the electronic device;
  • the third determining unit 203 is configured to determine third relative position information of the tracked target and the electronic device based on the first relative position information and the second relative position information;
  • the control unit 204 is configured to control the electronic device to track the target travel based on the third relative location information.
  • the first relative position information includes: a first distance between the electronic device and the tracked target and/or a first angle between the traveling direction of the electronic device and the tracked target;
  • the second relative position information includes: a second distance between the electronic device and the tracked target and/or a second angle between the traveling direction of the electronic device and the tracked target;
  • the third relative position information includes a third distance between the electronic device and the tracked target and/or a third angle between the traveling direction of the electronic device and the tracked target.
  • the first determining unit 201 includes:
  • the first determining subunit is configured to sense the location of the beacon through the carrierless communication module to determine the first relative location information; wherein the beacon is set on the tracked target.
  • the second determining unit 202 includes:
  • Obtaining a subunit configured to acquire image data collected by a camera
  • a second determining subunit configured to determine, in the image data, a target template corresponding to the tracked target
  • a third determining sub-unit configured to determine the second relative position information using the target template based on a visual tracking algorithm.
  • the second determining subunit is specifically configured to:
  • where Pt is the size of the target template in the subsequent frame image, P0 is the size of the target template in the initial frame image, d0 is the first distance determined at the initial moment corresponding to the initial frame image, and dt is the first distance determined at the subsequent moment corresponding to the subsequent frame image; the subsequent frame image and the initial frame image belong to the image data, and the subsequent frame image follows the initial frame image.
  • the third determining unit 203 includes: a fourth determining subunit configured to determine the third distance based on the equation d=duwb·cosθv, and a fifth determining subunit configured to determine the third angle based on the equation θ=σ·θvision+(1-σ)·θuwb;
  • the control unit 204 includes at least one of the following:
  • a first adjustment subunit configured to adjust a traveling direction of the electronic device based on the third angle so that the electronic device travels toward the tracked target
  • a second adjustment subunit configured to adjust a tilt angle of the camera based on the third angle, so that the camera is aligned with the target of tracking
  • the third adjustment subunit is configured to adjust a travel speed of the electronic device based on the third distance.
  • An embodiment of the present invention provides a target tracking device, applied to an electronic device, including a memory and a processor; the memory stores executable instructions, and the executable instructions are used to execute the target tracking method described in Embodiment 1 of the present invention.
  • An embodiment of the present invention provides a non-volatile storage medium storing executable instructions, where the executable instructions are used to execute the target tracking method described in Embodiment 1 of the present invention.
  • the storage medium may be one of, or a combination of, flash memory, disk storage, CD-ROM, and optical storage.
  • the electronic device described in this embodiment is an electronic device used to perform the information processing method in the embodiments of the present invention; based on the information processing method described in the embodiments of the present invention, those skilled in the art can understand the specific implementation of the electronic device of this embodiment and its variations, and any electronic device used by those skilled in the art to implement the information processing method in the embodiments of the present invention falls within the protection scope of the present invention.
  • in summary, a target tracking device is disclosed, applied to an electronic device provided with a camera and a carrierless communication module; the target tracking device includes: a first determining unit configured to determine first relative position information of the tracked target and the electronic device through the carrierless communication module; a second determining unit configured to determine second relative position information of the tracked target and the electronic device through the camera; a third determining unit configured to determine third relative position information of the tracked target and the electronic device based on the first relative position information and the second relative position information; and a control unit configured to control, based on the third relative position information, the electronic device to track the target.
  • since the electronic device carries both the carrierless communication module and the camera, the first relative position information determined by the carrierless communication module and the second relative position information determined by the camera are fused to obtain third relative position information with higher accuracy, and the third relative position information is then used to control the electronic device to track the target; this effectively solves the technical problem that the target tracking method in the prior art has poor stability and low robustness, and achieves the technical effect of improving the stability and robustness of the target tracking method.
  • those skilled in the art should understand that embodiments of the present invention may be provided as a method, a system, or a computer program product; accordingly, the present invention may take the form of an entirely hardware embodiment, an entirely software embodiment, or an embodiment combining software and hardware aspects.
  • furthermore, the present invention may take the form of a computer program product embodied on one or more computer-usable storage media (including but not limited to disk storage, CD-ROM, and optical storage) containing computer-usable program code.
  • these computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing device to operate in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including an instruction apparatus that implements the functions specified in one or more flows of the flowchart and/or one or more blocks of the block diagram.
  • these computer program instructions may also be loaded onto a computer or other programmable data processing device, such that a series of operational steps are performed on the computer or other programmable device to produce computer-implemented processing, and the instructions executed on the computer or other programmable device provide steps for implementing the functions specified in one or more flows of the flowchart and/or one or more blocks of the block diagram.
  • the first relative position information of the tracked target and the electronic device is determined by the carrierless communication module; the second relative position information of the tracked target and the electronic device is determined by the camera; and the first relative position information and the second Relative position information, determining a third relative position information of the tracked target and the electronic device; and controlling the electronic device to track the target travel based on the third relative position information.

Landscapes

  • Engineering & Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Automation & Control Theory (AREA)
  • Multimedia (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Theoretical Computer Science (AREA)
  • Electromagnetism (AREA)
  • Studio Devices (AREA)
  • Image Analysis (AREA)
  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)

Abstract

The present invention discloses a target tracking method applied to an electronic device, where the electronic device is provided with a camera and a carrierless communication module. The method includes: determining first relative position information of a tracked target and the electronic device through the carrierless communication module; determining second relative position information of the tracked target and the electronic device through the camera; determining third relative position information of the tracked target and the electronic device based on the first relative position information and the second relative position information; and controlling, based on the third relative position information, the electronic device to track the target. The present invention achieves the technical effect of improving the stability and robustness of the target tracking method. The present invention also discloses a target tracking device and a storage medium.

Description

Target tracking method, target tracking device, and storage medium
Technical Field
The present invention relates to the field of electronic technologies, and in particular to a target tracking method, a target tracking device, and a storage medium.
Background
At present, robots have become widespread and are applied in various fields. For example, ground robots (for example, balance vehicles) can be used for personal transportation, security patrol, and so on; aerial robots (for example, drones) can be used for material transportation, disaster relief, terrain exploration, power-line inspection, film and television shooting, and so on.
A robot can track a target. While tracking a target, the robot mostly uses a camera to collect image data of the target, uses a visual tracking algorithm to determine the relative position of the target and the robot, and then tracks the target based on that relative position. However, this method has poor stability, is susceptible to interference from illumination changes, and has low robustness.
Summary of the Invention
The embodiments of the present invention solve the technical problem that the target tracking method in the prior art has poor stability and low robustness, by providing a target tracking method, a target tracking device, and a storage medium.
In a first aspect, an embodiment of the present invention provides a target tracking method applied to an electronic device, where the electronic device is provided with a camera and a carrierless communication module, and the method includes:
determining first relative position information of a tracked target and the electronic device through the carrierless communication module;
determining second relative position information of the tracked target and the electronic device through the camera;
determining third relative position information of the tracked target and the electronic device based on the first relative position information and the second relative position information;
controlling, based on the third relative position information, the electronic device to track the target.
Preferably, the first relative position information includes: a first distance between the electronic device and the tracked target and/or a first angle between a traveling direction of the electronic device and the tracked target;
the second relative position information includes: a second distance between the electronic device and the tracked target and/or a second angle between the traveling direction of the electronic device and the tracked target;
the third relative position information includes: a third distance between the electronic device and the tracked target and/or a third angle between the traveling direction of the electronic device and the tracked target.
Preferably, the determining first relative position information of the tracked target and the electronic device through the carrierless communication module includes:
sensing a position of a beacon through the carrierless communication module, thereby determining the first relative position information, where the beacon is disposed on the tracked target.
Preferably, the determining second relative position information of the tracked target and the electronic device through the camera includes:
acquiring image data collected by the camera;
determining, in the image data, a target template corresponding to the tracked target;
determining the second relative position information by using the target template, based on a visual tracking algorithm.
Preferably, the determining, in the image data, the target template corresponding to the tracked target includes:
determining the size of the target template in a subsequent frame image based on the equation shown in Figure PCTCN2017073119-appb-000001;
where Pt is the size of the target template in the subsequent frame image, P0 is the size of the target template in the initial frame image, d0 is the first distance determined at the initial moment corresponding to the initial frame image, dt is the first distance determined at the subsequent moment corresponding to the subsequent frame image, the subsequent frame image and the initial frame image belong to the image data, and the subsequent frame image follows the initial frame image.
Preferably, the determining third relative position information of the tracked target and the electronic device based on the first relative position information and the second relative position information includes:
determining the third distance between the electronic device and the tracked target based on the equation d=duwb·cosθv,
where d is the third distance, duwb is the first distance, and θv is the pitch angle of the camera;
determining the third angle between the traveling direction of the electronic device and the tracked target based on the equation θ=σ·θvision+(1-σ)·θuwb,
where θ is the third angle, θuwb is the first angle, θvision is the second angle, and σ is a constant between 0 and 1 used to adjust the weights of θvision and θuwb.
Preferably, the controlling, based on the third relative position information, the electronic device to track the target includes at least one of the following:
adjusting the traveling direction of the electronic device based on the third angle, so that the electronic device travels toward the tracked target;
adjusting the pitch angle of the camera based on the third angle, so that the camera is aimed at the tracked target;
adjusting the traveling speed of the electronic device based on the third distance.
In another aspect, through an embodiment of the present invention, the present invention provides the following technical solution:
In a second aspect, an embodiment of the present invention provides a target tracking device applied to an electronic device, where the electronic device is provided with a camera and a carrierless communication module, and the target tracking device includes:
a first determining unit configured to determine first relative position information of a tracked target and the electronic device through the carrierless communication module;
a second determining unit configured to determine second relative position information of the tracked target and the electronic device through the camera;
a third determining unit configured to determine third relative position information of the tracked target and the electronic device based on the first relative position information and the second relative position information;
a control unit configured to control, based on the third relative position information, the electronic device to track the target.
Preferably, the first relative position information includes: a first distance between the electronic device and the tracked target and/or a first angle between a traveling direction of the electronic device and the tracked target;
the second relative position information includes: a second distance between the electronic device and the tracked target and/or a second angle between the traveling direction of the electronic device and the tracked target;
the third relative position information includes: a third distance between the electronic device and the tracked target and/or a third angle between the traveling direction of the electronic device and the tracked target.
Preferably, the first determining unit includes:
a first determining subunit configured to sense a position of a beacon through the carrierless communication module, thereby determining the first relative position information, where the beacon is disposed on the tracked target.
Preferably, the second determining unit includes:
an acquiring subunit configured to acquire image data collected by the camera;
a second determining subunit configured to determine, in the image data, a target template corresponding to the tracked target;
a third determining subunit configured to determine the second relative position information by using the target template, based on a visual tracking algorithm.
Preferably, the second determining subunit is specifically configured to:
determine the size of the target template in a subsequent frame image based on the equation shown in Figure PCTCN2017073119-appb-000002;
where Pt is the size of the target template in the subsequent frame image, P0 is the size of the target template in the initial frame image, d0 is the first distance determined at the initial moment corresponding to the initial frame image, dt is the first distance determined at the subsequent moment corresponding to the subsequent frame image, the subsequent frame image and the initial frame image belong to the image data, and the subsequent frame image follows the initial frame image.
Preferably, the third determining unit includes:
a fourth determining subunit configured to determine the third distance between the electronic device and the tracked target based on the equation d=duwb·cosθv, where d is the third distance, duwb is the first distance, and θv is the pitch angle of the camera;
a fifth determining subunit configured to determine the third angle between the traveling direction of the electronic device and the tracked target based on the equation θ=σ·θvision+(1-σ)·θuwb, where θ is the third angle, θuwb is the first angle, θvision is the second angle, and σ is a constant between 0 and 1 used to adjust the weights of θvision and θuwb.
Preferably, the control unit includes at least one of the following:
a first adjustment subunit configured to adjust the traveling direction of the electronic device based on the third angle, so that the electronic device travels toward the tracked target;
a second adjustment subunit configured to adjust the pitch angle of the camera based on the third angle, so that the camera is aimed at the tracked target;
a third adjustment subunit configured to adjust the traveling speed of the electronic device based on the third distance.
The one or more technical solutions provided in the embodiments of the present invention have at least the following technical effects or advantages:
In a third aspect, an embodiment of the present invention provides a target tracking device applied to an electronic device, including a memory and a processor, where the memory stores executable instructions, and the executable instructions are used to execute the target tracking method provided by the embodiments of the present invention.
In a fourth aspect, an embodiment of the present invention provides a non-volatile storage medium storing executable instructions, where the executable instructions are used to execute the target tracking method provided by the embodiments of the present invention.
In the embodiments of the present invention, since the electronic device carries both the carrierless communication module and the camera, the first relative position information determined by the carrierless communication module and the second relative position information determined by the camera are fused to obtain third relative position information with higher accuracy, and the third relative position information is then used to control the electronic device to track the target; this effectively solves the technical problem that the target tracking method in the prior art has poor stability and low robustness, and achieves the technical effect of improving the stability and robustness of the target tracking method.
Brief Description of the Drawings
To describe the technical solutions in the embodiments of the present invention more clearly, the following briefly introduces the accompanying drawings required for describing the embodiments. Apparently, the accompanying drawings in the following description show some embodiments of the present invention, and those of ordinary skill in the art may derive other drawings from these accompanying drawings without creative effort.
FIG. 1 is a flowchart of a target tracking method according to an embodiment of the present invention;
FIG. 2 and FIG. 3 are schematic diagrams of the target tracking method applied to a balance vehicle according to an embodiment of the present invention, where FIG. 2 is a side view and FIG. 3 is a top view;
FIG. 4 is a schematic diagram of the target tracking method applied to a drone according to an embodiment of the present invention;
FIG. 5 is a structural diagram of a target tracking device according to an embodiment of the present invention.
具体实施方式
本发明实施例通过提供一种目标跟踪方法及目标跟踪装置,解决了现有技术中的目标跟踪方法存在稳定性差,鲁棒性较低的技术问题。
本发明实施例的技术方案为解决上述技术问题,提供一种目标跟踪方法,应用于电子设备中,所述电子设备上设置有摄像头和无载波通信模块,所述方法包括:通过所述无载波通信模块确定跟踪的目标与所述电子设备的第一相对位置信息;通过所述摄像头确定所述跟踪的目标与所述电子设 备的第二相对位置信息;基于所述第一相对位置信息和所述第二相对位置信息,确定所述跟踪的目标与所述电子设备的第三相对位置信息;基于所述第三相对位置信息,控制所述电子设备跟踪所述目标行进。
为了更好的理解上述技术方案,下面将结合说明书附图以及具体的实施方式对上述技术方案进行详细的说明。
首先说明,本文中出现的术语“和/或”,仅仅是一种描述关联对象的关联关系,表示可以存在三种关系,例如,A和/或B,可以表示:单独存在A,同时存在A和B,单独存在B这三种情况。另外,本文中字符“/”,一般表示前后关联对象是一种“或”的关系。
实施例一
本实施例提供了一种目标跟踪方法,应用于电子设备中,所述电子设备可以是:地面机器人(例如:平衡车)、或无人机(例如:多旋翼无人机、或固定翼无人机)、或电动汽车等设备,此处,对于所述电子设备具体是何种设备,本实施例不做具体限定。
在具体实施过程中，在电子设备上设置有摄像头和无载波通信模块，所述无载波通信模块具体可以是超宽带(UWB,Ultra Wideband)通信模块，UWB是一种与传统通信技术有极大差异的通信技术，不需要使用载波来传输信号，而是通过发送和接收纳秒或亚纳秒级的极窄脉冲来传递数据，其具有抗干扰性能强、无载波、设备发射功率低等特点，可以用于精确定位，其距离精度能够达到10厘米左右。当然，本发明实施例的无载波通信模块也不仅限于UWB通信模块，实际应用中任何能用于目标跟踪的无载波通信模块应当都属于本发明实施例的保护范围。
如图2所示,以所述电子设备是平衡车为例,可以在平衡车的操控杆上安装摄像头和UWB通信模块。
如图4所示，以所述电子设备是无人机为例，可以在无人机下方安装摄像头和UWB通信模块，其中，该UWB通信模块为大功率UWB通信模块，具有较远的检测距离，适用于高空飞行的无人机。
进一步,如图1所示,本实施例提供的目标跟踪方法,包括:
步骤S101:通过无载波通信模块确定跟踪的目标与电子设备的第一相对位置信息。
作为一种可选的实施例,步骤S101,包括:
通过无载波通信模块感应信标的位置,从而确定第一相对位置信息;其中,所述信标设置在所述跟踪的目标上。
在具体实施过程中,第一相对位置信息包括:电子设备与跟踪的目标之间的第一距离和/或电子设备的行进方向与跟踪的目标的第一夹角。
在具体实施过程中，跟踪的目标携带有UWB信标，UWB信标内设置有接收天线，UWB通信模块内设置有发射天线，通过UWB通信模块可以感应到UWB信标的位置，从而确定电子设备与跟踪的目标之间的第一距离和/或电子设备的行进方向与跟踪的目标的第一夹角。但是，在UWB信标的接收天线与UWB通信模块内的发射天线极化方向不一致时（例如：UWB信标姿态发生变化时），获取的第一夹角的精度将大幅下降，容易出现跟踪方向的摆动。
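为便于理解，下面给出一段示意性的Python代码片段（仅为理解性草图，并非本发明实施例限定的实现方式；其中假设UWB通信模块能够给出信标在电子设备坐标系下的平面坐标，函数名与坐标约定均为假设）：

    import math

    def first_relative_position(beacon_x, beacon_y):
        # 假设：x轴沿电子设备行进方向，y轴指向行进方向左侧，单位为米
        d_uwb = math.hypot(beacon_x, beacon_y)        # 第一距离
        theta_uwb = math.atan2(beacon_y, beacon_x)    # 行进方向与跟踪的目标的第一夹角（弧度）
        return d_uwb, theta_uwb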
步骤S102:通过摄像头确定跟踪的目标与电子设备的第二相对位置信息。
作为一种可选的实施例,步骤S102,包括:
获取摄像头采集的图像数据;在图像数据中确定跟踪的目标对应的目标模板;基于视觉跟踪算法,利用目标模板确定第二相对位置信息。
在具体实施过程中,第二相对位置信息包括:电子设备与跟踪的目标之间的第二距离和/或电子设备的行进方向与跟踪的目标的第二夹角。
在具体实施过程中，可以在电子设备上设置一个摄像头或多个摄像头（例如：两个或两个以上的摄像头），通过摄像头采集包含目标的图像数据，然后在其中的某一帧图像（即：初始帧图像）中确定跟踪的目标对应的目标模板，再利用视觉跟踪算法计算出电子设备与跟踪的目标之间的第二距离和/或电子设备的行进方向与跟踪的目标的第二夹角。其中，所述视觉跟踪算法可以是任何短期跟踪（short-term tracking）算法，此处不做具体限定。
另外,在确定跟踪的目标对应的目标模板时,可以将摄像头采集到的初始帧图像通过显示屏显示出来,并获取用户的选择操作,再基于该选择操作从初始帧图像中确定所述目标模板;或者,还可以利用显著性分析(saliency detection)或目标检测(object detection)等方法确定所述目标模板。
在具体实施过程中，基于视觉跟踪算法，可以根据初始帧图像中定义的跟踪的目标对应的目标模板，训练一模型对后续帧图像中的该目标进行跟踪，并在跟踪过程中不断更新模型，以达到适应目标物体姿态变化，以及克服复杂背景干扰的目的。由于无需离线训练，该方法具有很高的通用性，可以对用户指定的任何物体进行跟踪；此外，基于视觉跟踪算法计算出的第二夹角准确性很高。但是，由于实际应用场景往往十分复杂（例如：光照的变化，相似目标的干扰），因此视觉跟踪算法的鲁棒性不好，计算出的第二距离容易受到影响，无法达到产品应用级别，此外，也无法准确地判断目标是否跟丢。
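作为对上述视觉跟踪过程的一个极简示意，下面给出一段基于模板匹配的Python/OpenCV代码片段（仅为草图：实际可替换为任意短期跟踪算法，其中摄像头水平视场角fov_deg等参数均为假设值）：

    import cv2

    def second_angle_by_template(frame_gray, template_gray, fov_deg=60.0):
        # 在当前帧中搜索目标模板的最佳匹配位置
        res = cv2.matchTemplate(frame_gray, template_gray, cv2.TM_CCOEFF_NORMED)
        _, max_val, _, max_loc = cv2.minMaxLoc(res)
        th, tw = template_gray.shape[:2]
        cx = max_loc[0] + tw / 2.0                    # 目标中心的横坐标（像素）
        img_w = frame_gray.shape[1]
        # 用目标中心相对图像中心的水平偏移近似第二夹角（单位：度）
        theta_vision = (cx - img_w / 2.0) / img_w * fov_deg
        return theta_vision, max_val                  # max_val可作为匹配得分备用

其中按小孔成像模型用像素偏移与视场角线性近似夹角，仅适用于偏角较小的情形。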
作为一种可选的实施例,所述在图像数据中确定跟踪的目标对应的目标模板,包括:
基于等式(1)确定后续帧图像中的目标模板的大小:
Pt=P0·(d0/dt)         (1)
其中，Pt为后续帧图像中的目标模板的大小，P0为初始帧图像中的目标模板的大小，d0为在初始帧图像对应的初始时刻确定的第一距离，dt为在后续帧图像对应的后续时刻确定的第一距离，后续帧图像和初始帧图像属于图像数据，且后续帧图像位于初始帧图像之后。
举例来讲,在t0时刻(即:初始时刻),在从初始帧图像中确定跟踪的目标对应的目标模板后,记录该目标模板的大小P0,以及记录在t0时刻通过UWB通信模块确定的电子设备与跟踪的目标之间的第一距离d0;在t0时刻之后的t1时刻(即:后续时刻),在利用视觉跟踪算法从后续帧图像(即:初始帧之后的任一帧)中确定电子设备与跟踪的目标的第二相对位置信息时,获取通过UWB通信模块确定的电子设备与跟踪的目标之间的第一距离dt;并基于等式(1)确定后续帧图像中的跟踪的目标对应的目标模板的大小Pt
Pt=P0·(d0/dt)
由于在利用视觉跟踪算法确定第二相对位置信息时，需要考虑目标模板的大小和目标模板在图像中的位置，目标模板的大小能够反映跟踪的目标距离电子设备的远近（一般，目标模板越大，距离越近；目标模板越小，距离越远），若目标模板大小不准确，则会影响第二相对位置信息的准确性。由于UWB通信模块测量到的第一距离准确性很高，此处利用UWB通信模块测量到的第一距离来修正目标模板的大小，再基于修正后的目标模板的大小（即：Pt）利用视觉跟踪算法来确定第二相对位置信息，可以大大提升第二相对位置信息的准确性，提升视觉跟踪算法的精度。
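按照等式(1)的思路，模板大小的修正可以写成如下示意性片段（Python/OpenCV，仅为草图，假设模板以像素宽高表示）：

    import cv2

    def rescale_template(template_img, d0, dt):
        # Pt = P0·(d0/dt)：目标变远(dt>d0)时模板缩小，变近时模板放大
        scale = d0 / dt
        h0, w0 = template_img.shape[:2]
        new_w = max(1, int(round(w0 * scale)))
        new_h = max(1, int(round(h0 * scale)))
        return cv2.resize(template_img, (new_w, new_h))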
步骤S103:基于第一相对位置信息和第二相对位置信息,确定跟踪的目标与电子设备的第三相对位置信息。
在具体实施过程中，由于通过UWB通信模块测得的第一相对位置信息和通过视觉跟踪算法获得的第二相对位置信息各有优点和不足（一般来讲，UWB通信模块获得的第一距离要比视觉跟踪算法获得的第二距离更准确，而视觉跟踪算法获得的第二夹角又要比UWB通信模块获得的第一夹角更加准确），所以，此处将第一相对位置信息和第二相对位置信息相融合（对于具体融合方式，此处不做限定），目的在于获取更准确的第三相对位置信息。
在具体实施过程中,第三相对位置信息包括:电子设备与跟踪的目标之间的第三距离和/或电子设备的行进方向与跟踪的目标的第三夹角。
另外,在利用UWB通信模块对目标进行跟踪时,是不会出现跟丢的情况的,所以UWB通信模块确定的第一相对位置信息还可以用于辅助视觉跟踪算法来判定目标是否丢失,从而解决了视觉跟踪算法无法准确地判断目标是否跟丢的问题。
在具体实施过程中，在某些不利情况下，第一相对位置信息或第二相对位置信息的准确性可能会急剧下降，严重影响最终的第三相对位置信息的准确性。为了避免这种情况，可以对第一相对位置信息和第二相对位置信息分别设置置信度，用于分别表示第一相对位置信息和第二相对位置信息的可信程度。若第一相对位置信息和第二相对位置信息一致性较高，则不考虑置信度，直接将第一相对位置信息和第二相对位置信息进行融合，获得第三相对位置信息；若第一相对位置信息和第二相对位置信息差异较大，则置信度较高的信息起决定性作用；或者，在某一信息的置信度太低时，直接不予考虑。
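上述基于置信度的融合策略可以用如下示意性片段概括（Python，一致性阈值、最低置信度、权重等参数均为假设值，仅为草图）：

    def fuse_with_confidence(x_uwb, c_uwb, x_vision, c_vision,
                             consist_th=0.2, min_conf=0.3, sigma=0.5):
        # x_uwb/x_vision为同一物理量的两路测量，c_uwb/c_vision为对应置信度
        if c_uwb < min_conf and c_vision >= min_conf:
            return x_vision                    # UWB置信度太低，直接不予考虑
        if c_vision < min_conf and c_uwb >= min_conf:
            return x_uwb                       # 视觉置信度太低，直接不予考虑
        if abs(x_uwb - x_vision) <= consist_th * max(abs(x_uwb), abs(x_vision), 1e-6):
            return sigma * x_vision + (1 - sigma) * x_uwb   # 一致性较高，直接融合
        return x_uwb if c_uwb >= c_vision else x_vision     # 差异较大，置信度高者起决定作用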
其中，在通过UWB通信模块感应UWB信标的位置来确定第一相对位置信息时，可能会受到环境中其它无线信号的干扰，导致确定的第一相对位置信息不够准确，可信程度不高。所以，此处针对每一时刻获得的第一相对位置信息都设置一置信度，来表示每一时刻获得的第一相对位置信息的可信程度。具体来讲，可以在t0时刻确定第一相对位置信息时，保存UWB通信模块感应UWB信标时的波形信号（简称：原始波形信号）以作备用，在t0时刻之后的ti时刻，在通过UWB通信模块感应UWB信标的位置时，将ti时刻UWB通信模块感应到的波形信号（简称：第i波形信号）与原始波形信号做比对，计算出第i波形信号与原始波形信号的相似度，将该相似度作为在ti时刻获取的第一相对位置信息的置信度。其中，相似度越大，则说明在ti时刻获取的第一相对位置信息可信程度越高，即置信度越高；相似度越小，则说明在ti时刻获取的第一相对位置信息可信程度越低，即置信度越低。
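波形相似度的一种示意性计算方式如下（Python/NumPy，仅为草图，实际的波形表示与相似度度量取决于具体的UWB芯片与驱动）：

    import numpy as np

    def uwb_confidence(raw_waveform, current_waveform):
        # 用归一化互相关近似第i波形信号与原始波形信号的相似度，并映射到[0, 1]
        a = (raw_waveform - raw_waveform.mean()) / (raw_waveform.std() + 1e-9)
        b = (current_waveform - current_waveform.mean()) / (current_waveform.std() + 1e-9)
        n = min(len(a), len(b))
        corr = float(np.dot(a[:n], b[:n]) / n)     # 约在[-1, 1]之间
        return max(0.0, min(1.0, (corr + 1.0) / 2.0))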
其中，在基于视觉跟踪算法确定第二相对位置信息时，可能会出现跟丢的情况，此时目标模板内的目标并不是原始的跟踪目标，导致确定的第二相对位置信息不准确，可信程度不高。所以，此处针对每一帧图像获得的第二相对位置信息都设置一置信度，来表示每一帧图像获得的第二相对位置信息的可信程度。具体来讲，可以保存初始帧图像中目标模板内的图像（简称：原始图像）以作备用，在对后续帧图像确定第二相对位置信息时，获取后续帧图像中目标模板内的图像（简称：后续图像），并将后续图像与原始图像做比对，计算出后续图像与原始图像的相似度，将该相似度作为基于后续帧图像确定的第二相对位置信息的置信度。其中，相似度越大，则说明基于后续帧图像确定的第二相对位置信息的可信程度越高，即置信度越高；相似度越小，则说明基于后续帧图像确定的第二相对位置信息的可信程度越低，即置信度越低。
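类似地，视觉侧的置信度可以用原始图像与后续图像的相似度近似，下面给出一种基于直方图比较的示意性写法（Python/OpenCV，仅为草图，假设输入为单通道灰度图像块）：

    import cv2

    def vision_confidence(original_patch, current_patch):
        # 比较初始帧目标模板内图像与后续帧目标模板内图像的直方图相似度
        h1 = cv2.calcHist([original_patch], [0], None, [64], [0, 256])
        h2 = cv2.calcHist([current_patch], [0], None, [64], [0, 256])
        cv2.normalize(h1, h1)
        cv2.normalize(h2, h2)
        sim = cv2.compareHist(h1, h2, cv2.HISTCMP_CORREL)   # 相关性，约[-1, 1]
        return max(0.0, min(1.0, (sim + 1.0) / 2.0))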
下面,以所述电子设备为平衡车为例,对步骤S103进行举例说明。
如图2所示，在对第一距离和第二距离进行比较时，发现第二距离与第一距离差异较大，而第一距离的置信度更高，则在确定平衡车与跟踪的目标的第三距离时，可以只考虑第一距离。例如，可以基于等式(2)确定第三距离：
d=duwb·cosθv         (2)
其中,d为第三距离,duwb为第一距离,摄像头可以上下转动,θv为摄像头的俯仰角,此处,使用摄像头仰角来近似第一夹角θuwb,当然,第一夹角θuwb也可以基于UWB通信模块(装配在平衡车上)和UWB信标(装配在跟踪的目标上)的之间的信号传输来计算得出。
如图3所示，在确定平衡车的行进方向与跟踪的目标的第三夹角时，还可以考虑UWB通信模块的性能，基于UWB通信模块的性能调节第一夹角和第二夹角的权重。例如，可以基于等式(3)确定平衡车的行进方向与跟踪的目标的第三夹角：
θ=σ·θvision+(1-σ)·θuwb      (3)
其中,θ为第三夹角;θuwb为第一夹角;θvision为第二夹角;σ为0至1的常数,用于调整θvision和θuwb的权重。σ与UWB通信模块的性能有关,若UWB通信模块的性能越好,其测得的θuwb准确性越高,则θuwb的权重越大;若UWB通信模块的性能越差,其测得的θuwb准确性越低,则θuwb的权重越小。
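结合等式(2)和等式(3)，平衡车一侧的融合计算可以写成如下示意性片段（Python，σ的取值仅为示例，应按UWB通信模块的实际性能整定）：

    import math

    def fuse_balance_car(d_uwb, theta_v_deg, theta_vision_deg, theta_uwb_deg, sigma=0.7):
        # 等式(2)：第三距离 d = duwb·cosθv，θv为摄像头的俯仰角
        d3 = d_uwb * math.cos(math.radians(theta_v_deg))
        # 等式(3)：第三夹角 θ = σ·θvision + (1-σ)·θuwb
        theta3 = sigma * theta_vision_deg + (1 - sigma) * theta_uwb_deg
        return d3, theta3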
下面,以所述电子设备为无人机为例,对步骤S103进行举例说明。
如图4所示,在确定跟踪的目标到无人机的水平方向的距离(即:第三距离)时,视觉跟踪算法所确定的第二夹角误差较大,其置信度太低,所以不予考虑。例如,可以基于等式(4)计算跟踪的目标到无人机的水平方向的距离(即:第三距离):
d=duwb·cosθuwb        (4)
其中，d为第三距离，duwb为第一距离，θuwb为第一夹角，由于UWB通信模块固定在无人机上，此处可以将无人机姿态倾角近似作为第一夹角θuwb，当然，第一夹角θuwb也可以基于UWB通信模块（装配在无人机上）和UWB信标（装配在跟踪的目标上）之间的信号传输来计算得出。
在确定无人机的行进方向与跟踪的目标的仰俯角（即：第三夹角）时，也可以考虑UWB通信模块的性能，基于UWB通信模块的性能调节第一夹角和第二夹角的权重。例如，可以基于等式(3)确定无人机的行进方向与跟踪的目标的仰俯角（即：第三夹角）：
θ=σ·θvision+(1-σ)·θuwb       (3)
其中，θ为第三夹角；θuwb为第一夹角（可以用无人机姿态倾角近似代替）；θvision为第二夹角；σ为0至1的常数，用于调整θvision和θuwb的权重。σ与UWB通信模块的性能有关，若UWB通信模块的性能越好，则θuwb的权重越大；若UWB通信模块的性能越差，则θuwb的权重越小。
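对无人机而言，等式(4)中的θuwb可以用机体姿态倾角近似，示意性写法如下（Python，仅为草图）：

    import math

    def drone_horizontal_distance(d_uwb, pitch_deg):
        # 等式(4)：d = duwb·cosθuwb，此处用无人机姿态倾角近似θuwb
        return d_uwb * math.cos(math.radians(pitch_deg))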
步骤S104:基于第三相对位置信息,控制电子设备跟踪所述目标行进。
作为一种可选的实施例，步骤S104，包括：基于第三夹角，调节电子设备的行进方向，使得电子设备朝向跟踪的目标行进；和/或，基于第三夹角，调节摄像头的仰俯角，使得摄像头对准跟踪的目标。
举例来讲,在所述电子设备是平衡车时,可以基于第三夹角调节平衡车的行进方向,使得平衡车朝向跟踪的目标行进,同时,基于第三夹角调节平衡车上摄像头的仰俯角,使得摄像头对准跟踪的目标,保证跟踪的目标始终位于摄像头拍摄的图像的中心。
举例来讲,在所述电子设备是无人机时,可以基于第三夹角调节无人机的飞行方向(可以通过调节无人机的仰俯角、偏航角、滚转角,来对无人机的飞行方向进行调节),使得无人机朝向跟踪的目标飞行,同时,还可以基于第三夹角调节无人机上摄像头的仰俯角,使得摄像头对准跟踪的目标,保证跟踪的目标始终位于摄像头拍摄的图像的中心。
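行进方向与摄像头仰俯角的调节可以用简单的比例控制来示意（Python，增益与限幅均为假设值，仅为草图，并非本发明实施例限定的控制律）：

    def adjust_heading_and_pitch(theta3_deg, pitch_err_deg,
                                 k_yaw=0.8, k_pitch=0.5, max_rate=30.0):
        # theta3_deg：第三夹角；pitch_err_deg：目标中心相对画面中心的纵向角偏差（假设输入）
        yaw_rate = max(-max_rate, min(max_rate, k_yaw * theta3_deg))        # 朝向目标转向
        pitch_rate = max(-max_rate, min(max_rate, k_pitch * pitch_err_deg)) # 保持目标居中
        return yaw_rate, pitch_rate   # 角速度指令，单位：度/秒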
作为一种可选的实施例，步骤S104，还包括：基于第三距离调节电子设备的行进速度。
举例来讲,在所述电子设备是平衡车时,可以基于第三距离调节平衡车的行驶速度,其中,平衡车的行驶速度与第三距离为正比例关系,在第三距离越大时,平衡车的行驶速度越大,从而可以缩短平衡车与跟踪的目标之间的距离,防止跟丢;在第三距离越小时,则平衡车的行驶速度越小,以避免平衡车与跟踪的目标发生碰撞。
举例来讲，在所述电子设备是无人机时，可以基于第三距离调节无人机的飞行速度，其中，无人机的飞行速度与第三距离为正比例关系，在第三距离越大时，无人机的飞行速度越大，从而可以缩短无人机与跟踪的目标之间的距离，防止跟丢；在第三距离越小时，则无人机的飞行速度越小，以避免无人机与跟踪的目标发生碰撞。
另外,还可以基于跟踪的目标对应的目标模板的大小,调节电子设备的行驶速度。具体来讲,在初始帧图像中确定的目标模板,在后续帧图像中该目标模板的大小会发生变化。一般,电子设备与跟踪的目标的距离越近,则目标模板的尺寸越大,此处可以定义电子设备的行进速度(或飞行速度)反比于目标模板的尺寸,可以采用sigmoid函数制定速度衰减模型,使电子设备在接近目标物体的时候快速减速,以避免发生碰撞。
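速度衰减模型的一种示意性实现如下（Python，其中v_max、k、d_stop等参数均为假设值，仅为草图；也可以把自变量换成目标模板尺寸得到类似效果）：

    import math

    def speed_command(d3, v_max=3.0, k=2.0, d_stop=1.5):
        # 基于sigmoid的速度衰减：d3远大于d_stop时速度接近v_max，
        # 接近或小于d_stop时速度快速衰减到0附近，以避免与目标碰撞
        return v_max / (1.0 + math.exp(-k * (d3 - d_stop)))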
上述本发明实施例中的技术方案,至少具有如下的技术效果或优点:
在本发明实施例中,公开了一种目标跟踪方法,应用于电子设备中,所述电子设备上设置有摄像头和无载波通信模块,所述方法包括:通过无载波通信模块确定跟踪的目标与电子设备的第一相对位置信息;通过摄像头确定跟踪的目标与电子设备的第二相对位置信息;基于第一相对位置信息和第二相对位置信息,确定跟踪的目标与电子设备的第三相对位置信息;基于第三相对位置信息,控制电子设备跟踪所述目标行进。由于电子设备同时搭载无载波通信模块和摄像头,将无载波通信模块确定的第一相对位置信息和摄像头确定的第二相对位置信息进行融合,获得了准确性更高的第三相对位置信息,再利用第三相对位置信息控制电子设备跟踪所述目标行进,所以有效地解决了现有技术中的目标跟踪方法存在稳定性差,鲁棒性较低的技术问题,实现了提高目标跟踪方法的稳定性和鲁棒性的技术效果。
实施例二
本实施例提供了一种目标跟踪装置,应用于电子设备中,所述电子设备上设置有摄像头和无载波通信模块,如图5所示,所述目标跟踪装置,包括:
第一确定单元201，配置为通过无载波通信模块确定跟踪的目标与电子设备的第一相对位置信息；
第二确定单元202,配置为通过摄像头确定跟踪的目标与电子设备的第二相对位置信息;
第三确定单元203,配置为基于第一相对位置信息和第二相对位置信息,确定跟踪的目标与电子设备的第三相对位置信息;
控制单元204,配置为基于第三相对位置信息,控制电子设备跟踪所述目标行进。
作为一种可选的实施例,第一相对位置信息包括:电子设备与跟踪的目标之间的第一距离和/或电子设备的行进方向与跟踪的目标的第一夹角;
第二相对位置信息包括:电子设备与跟踪的目标之间的第二距离和/或电子设备的行进方向与跟踪的目标的第二夹角;
第三相对位置信息包括:电子设备与跟踪的目标之间的第三距离和/或电子设备的行进方向与跟踪的目标的第三夹角。
作为一种可选的实施例,第一确定单元201,包括:
第一确定子单元,配置为通过无载波通信模块感应信标的位置,从而确定第一相对位置信息;其中,信标设置在跟踪的目标上。
作为一种可选的实施例,第二确定单元202,包括:
获取子单元,配置为获取摄像头采集的图像数据;
第二确定子单元,配置为在图像数据中确定跟踪的目标对应的目标模板;
第三确定子单元,配置为基于视觉跟踪算法,利用目标模板确定第二相对位置信息。
作为一种可选的实施例,所述第二确定子单元,具体配置为:
基于等式
Pt=P0·(d0/dt)
确定后续帧图像中的所述目标模板的大小;
其中，Pt为所述后续帧图像中的所述目标模板的大小，P0为初始帧图像中的所述目标模板的大小，d0为在所述初始帧图像对应的初始时刻确定的所述第一距离，dt为在所述后续帧图像对应的后续时刻确定的所述第一距离，所述后续帧图像和所述初始帧图像属于所述图像数据，且所述后续帧图像位于所述初始帧图像之后。
作为一种可选的实施例,第三确定单元203,包括:
第四确定子单元,配置为基于等式d=duwb·cosθv,确定电子设备与跟踪的目标的第三距离:其中,d为第三距离,duwb为第一距离,θv为摄像头的俯仰角;
第五确定子单元,配置为基于等式θ=σ·θvision+(1-σ)·θuwb,确定电子设备的行进方向与跟踪的目标的第三夹角:其中,θ为第三夹角;θuwb为第一夹角;θvision为第二夹角;σ为0至1的常数,用于调整θvision和θuwb的权重。
作为一种可选的实施例,控制单元204,包括以下至少一项:
第一调节子单元,配置为基于第三夹角,调节电子设备的行进方向,使得电子设备朝向跟踪的目标行进;
第二调节子单元,配置为基于第三夹角,调节摄像头的仰俯角,使得摄像头对准跟踪的目标;
第三调节子单元,配置为基于第三距离,调节电子设备的行进速度。
实施例三
本发明实施例提供一种目标跟踪装置,应用于电子设备中,包括存储器和处理器,所述存储器中存储有可执行指令,所述可执行指令用于执行本发明前述实施例一记载的目标跟踪方法。
实施例四
本发明实施例提供一种非易失性存储介质，存储有可执行指令，所述可执行指令用于执行本发明前述实施例一记载的目标跟踪方法。
示例性地，存储介质可以提供为闪存（Flash）、磁盘存储器、CD-ROM、光学存储器其中之一或其组合的方式。
由于本实施例所介绍的电子设备为实施本发明实施例中目标跟踪方法所采用的电子设备，故而基于本发明实施例中所介绍的目标跟踪方法，本领域所属技术人员能够了解本实施例的电子设备的具体实施方式以及其各种变化形式，所以在此对于该电子设备如何实现本发明实施例中的方法不再详细介绍。只要本领域所属技术人员实施本发明实施例中目标跟踪方法所采用的电子设备，都属于本发明所欲保护的范围。
上述本发明实施例中的技术方案,至少具有如下的技术效果或优点:
在本发明实施例中,公开了一种目标跟踪装置,应用于电子设备中,电子设备上设置有摄像头和无载波通信模块,所述目标跟踪装置,包括:第一确定单元,配置为通过无载波通信模块确定跟踪的目标与电子设备的第一相对位置信息;第二确定单元,配置为通过摄像头确定跟踪的目标与电子设备的第二相对位置信息;第三确定单元,配置为基于第一相对位置信息和第二相对位置信息,确定跟踪的目标与电子设备的第三相对位置信息;控制单元,配置为基于第三相对位置信息,控制电子设备跟踪所述目标行进。由于电子设备同时搭载无载波通信模块和摄像头,将无载波通信模块确定的第一相对位置信息和摄像头确定的第二相对位置信息进行融合,获得了准确性更高的第三相对位置信息,再利用第三相对位置信息控制电子设备跟踪所述目标行进,所以有效地解决了现有技术中的目标跟踪方法存在稳定性差,鲁棒性较低的技术问题,实现了提高目标跟踪方法的稳定性和鲁棒性的技术效果。
本领域内的技术人员应明白，本发明的实施例可提供为方法、系统、或计算机程序产品。因此，本发明可采用完全硬件实施例、完全软件实施例、或结合软件和硬件方面的实施例的形式。而且，本发明可采用在一个或多个其中包含有计算机可用程序代码的计算机可用存储介质（包括但不限于磁盘存储器、CD-ROM、光学存储器等）上实施的计算机程序产品的形式。
本发明是参照根据本发明实施例的方法、设备(系统)、和计算机程序产品的流程图和/或方框图来描述的。应理解可由计算机程序指令实现流程图和/或方框图中的每一流程和/或方框、以及流程图和/或方框图中的流程和/或方框的结合。可提供这些计算机程序指令到通用计算机、专用计算机、嵌入式处理机或其他可编程数据处理设备的处理器以产生一个机器,使得通过计算机或其他可编程数据处理设备的处理器执行的指令产生用于实现在流程图一个流程或多个流程和/或方框图一个方框或多个方框中指定的功能的装置。
这些计算机程序指令也可存储在能引导计算机或其他可编程数据处理设备以特定方式工作的计算机可读存储器中,使得存储在该计算机可读存储器中的指令产生包括指令装置的制造品,该指令装置实现在流程图一个流程或多个流程和/或方框图一个方框或多个方框中指定的功能。
这些计算机程序指令也可装载到计算机或其他可编程数据处理设备上,使得在计算机或其他可编程设备上执行一系列操作步骤以产生计算机实现的处理,从而在计算机或其他可编程设备上执行的指令提供用于实现在流程图一个流程或多个流程和/或方框图一个方框或多个方框中指定的功能的步骤。
尽管已描述了本发明的优选实施例,但本领域内的技术人员一旦得知了基本创造性概念,则可对这些实施例作出另外的变更和修改。所以,所附权利要求意欲解释为包括优选实施例以及落入本发明范围的所有变更和修改。
显然，本领域的技术人员可以对本发明进行各种改动和变型而不脱离本发明的精神和范围。这样，倘若本发明的这些修改和变型属于本发明权利要求及其等同技术的范围之内，则本发明也意图包含这些改动和变型在内。
工业实用性
本发明实施例中，通过无载波通信模块确定跟踪的目标与电子设备的第一相对位置信息；通过摄像头确定跟踪的目标与电子设备的第二相对位置信息；基于第一相对位置信息和第二相对位置信息，确定跟踪的目标与所述电子设备的第三相对位置信息；基于第三相对位置信息，控制电子设备跟踪所述目标行进。能够达到提高目标跟踪方法的稳定性和鲁棒性的技术效果。

Claims (16)

  1. 一种目标跟踪方法,应用于电子设备中,所述电子设备上设置有摄像头和无载波通信模块,所述方法包括:
    通过所述无载波通信模块确定跟踪的目标与所述电子设备的第一相对位置信息;
    通过所述摄像头确定所述跟踪的目标与所述电子设备的第二相对位置信息;
    基于所述第一相对位置信息和所述第二相对位置信息,确定所述跟踪的目标与所述电子设备的第三相对位置信息;
    基于所述第三相对位置信息,控制所述电子设备跟踪所述目标行进。
  2. 如权利要求1所述的目标跟踪方法,其中,
    所述第一相对位置信息包括:所述电子设备与所述跟踪的目标之间的第一距离和/或所述电子设备的行进方向与所述跟踪的目标的第一夹角;
    所述第二相对位置信息包括:所述电子设备与所述跟踪的目标之间的第二距离和/或所述电子设备的行进方向与所述跟踪的目标的第二夹角;
    所述第三相对位置信息包括:所述电子设备与所述跟踪的目标之间的第三距离和/或所述电子设备的行进方向与所述跟踪的目标的第三夹角。
  3. 如权利要求1所述的目标跟踪方法,其中,所述通过所述无载波通信模块确定跟踪的目标与所述电子设备的第一相对位置信息,包括:
    通过所述无载波通信模块感应信标的位置,确定所述第一相对位置信息;其中,所述信标设置在所述跟踪的目标上。
  4. 如权利要求1所述的目标跟踪方法,其中,所述通过所述摄像头确定所述跟踪的目标与所述电子设备的第二相对位置信息,包括:
    获取所述摄像头采集的图像数据;
    在所述图像数据中确定所述跟踪的目标对应的目标模板;
    基于视觉跟踪算法,利用所述目标模板确定所述第二相对位置信息。
  5. 如权利要求4所述的目标跟踪方法,其中,在所述图像数据中确定所述跟踪的目标对应的目标模板,包括:
    基于等式
    Pt=P0·(d0/dt)
    确定后续帧图像中的所述目标模板的大小;
    其中,Pt为所述后续帧图像中的所述目标模板的大小,P0为初始帧图像中的所述目标模板的大小,d0为在所述初始帧图像对应的初始时刻确定的所述第一距离,dt为在所述后续帧图像对应的后续时刻确定的所述第一距离,所述后续帧图像和所述初始帧图像属于所述图像数据,且所述后续帧图像位于所述初始帧图像之后。
  6. 如权利要求2所述的目标跟踪方法,其中,所述基于所述第一相对位置信息和所述第二相对位置信息,确定所述跟踪的目标与所述电子设备的第三相对位置信息,包括:
    基于等式d=duwb·cosθv,确定所述电子设备与所述跟踪的目标的第三距离:
    其中,d为所述第三距离,duwb为所述第一距离,θv为所述摄像头的俯仰角;
    基于等式θ=σ·θvision+(1-σ)·θuwb,确定所述电子设备的行进方向与所述跟踪的目标的第三夹角:
    其中,θ为所述第三夹角;θuwb为所述第一夹角;θvision为所述第二夹角;σ为0至1的常数,用于调整θvision和θuwb的权重。
  7. 如权利要求2所述的目标跟踪方法,其中,所述基于所述第三相对位置信息,控制所述电子设备跟踪所述目标行进,包括以下至少一项:
    基于所述第三夹角,调节所述电子设备的行进方向,使得所述电子设备朝向所述跟踪的目标行进;
    基于所述第三夹角，调节所述摄像头的仰俯角，使得所述摄像头对准所述跟踪的目标；
    基于所述第三距离,调节所述电子设备的行进速度。
  8. 一种目标跟踪装置,应用于电子设备中,所述电子设备上设置有摄像头和无载波通信模块,所述目标跟踪装置,包括:
    第一确定单元,配置为通过所述无载波通信模块确定跟踪的目标与所述电子设备的第一相对位置信息;
    第二确定单元,配置为通过所述摄像头确定所述跟踪的目标与所述电子设备的第二相对位置信息;
    第三确定单元,配置为基于所述第一相对位置信息和所述第二相对位置信息,确定所述跟踪的目标与所述电子设备的第三相对位置信息;
    控制单元,配置为基于所述第三相对位置信息,控制所述电子设备跟踪所述目标行进。
  9. 如权利要求8所述的目标跟踪装置,其中,
    所述第一相对位置信息包括:所述电子设备与所述跟踪的目标之间的第一距离和/或所述电子设备的行进方向与所述跟踪的目标的第一夹角;
    所述第二相对位置信息包括:所述电子设备与所述跟踪的目标之间的第二距离和/或所述电子设备的行进方向与所述跟踪的目标的第二夹角;
    所述第三相对位置信息包括:所述电子设备与所述跟踪的目标之间的第三距离和/或所述电子设备的行进方向与所述跟踪的目标的第三夹角。
  10. 如权利要求8所述的目标跟踪装置,其中,所述第一确定单元,包括:
    第一确定子单元,配置为通过所述无载波通信模块感应信标的位置,从而确定所述第一相对位置信息;其中,所述信标设置在所述跟踪的目标上。
  11. 如权利要求8所述的目标跟踪装置,其中,所述第二确定单元,包括:
    获取子单元,配置为获取所述摄像头采集的图像数据;
    第二确定子单元,配置为在所述图像数据中确定所述跟踪的目标对应的目标模板;
    第三确定子单元,配置为基于视觉跟踪算法,利用所述目标模板确定所述第二相对位置信息。
  12. 如权利要求11所述的目标跟踪装置,其中,所述第二确定子单元,具体配置为:
    基于等式
    Pt=P0·(d0/dt)
    确定后续帧图像中的所述目标模板的大小;
    其中,Pt为所述后续帧图像中的所述目标模板的大小,P0为初始帧图像中的所述目标模板的大小,d0为在所述初始帧图像对应的初始时刻确定的所述第一距离,dt为在所述后续帧图像对应的后续时刻确定的所述第一距离,所述后续帧图像和所述初始帧图像属于所述图像数据,且所述后续帧图像位于所述初始帧图像之后。
  13. 如权利要求9所述的目标跟踪装置,其中,所述第三确定单元,包括:
    第四确定子单元,配置为基于等式d=duwb·cosθv,确定所述电子设备与所述跟踪的目标的第三距离:其中,d为所述第三距离,duwb为所述第一距离,θv为所述摄像头的俯仰角;
    第五确定子单元,配置为基于等式θ=σ·θvision+(1-σ)·θuwb,确定所述电子设备的行进方向与所述跟踪的目标的第三夹角:其中,θ为所述第三夹角;θuwb为所述第一夹角;θvision为所述第二夹角;σ为0至1的常数,用于调整θvision和θuwb的权重。
  14. 如权利要求9所述的目标跟踪装置,其中,所述控制单元,包括以下至少一项:
    第一调节子单元，配置为基于所述第三夹角，调节所述电子设备的行进方向，使得所述电子设备朝向所述跟踪的目标行进；
    第二调节子单元,配置为基于所述第三夹角,调节所述摄像头的仰俯角,使得所述摄像头对准所述跟踪的目标;
    第三调节子单元,配置为基于所述第三距离,调节所述电子设备的行进速度。
  15. 一种目标跟踪装置,应用于电子设备中,包括存储器和处理器,所述存储器中存储有可执行指令,所述可执行指令用于执行权利要求1至7任一项所述的目标跟踪方法。
  16. 一种非易失性存储介质,存储有可执行指令,所述可执行指令用于执行权利要求1至7任一项所述的目标跟踪方法。
PCT/CN2017/073119 2016-10-31 2017-02-08 一种目标跟踪方法及目标跟踪装置、存储介质 WO2018076572A1 (zh)

Priority Applications (2)

Application Number Priority Date Filing Date Title
EP17863668.4A EP3410062A4 (en) 2016-10-31 2017-02-08 TARGET TRACKING METHOD, TARGET TRACKING APPARATUS, AND MEMORY MEDIUM
US16/078,087 US20190049549A1 (en) 2016-10-31 2017-02-08 Target tracking method, target tracking appartus, and storage medium

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201610971880.6 2016-10-31
CN201610971880 2016-10-31

Publications (1)

Publication Number Publication Date
WO2018076572A1 true WO2018076572A1 (zh) 2018-05-03

Family

ID=58866350

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2017/073119 WO2018076572A1 (zh) 2016-10-31 2017-02-08 一种目标跟踪方法及目标跟踪装置、存储介质

Country Status (4)

Country Link
US (1) US20190049549A1 (zh)
EP (1) EP3410062A4 (zh)
CN (1) CN106683123B (zh)
WO (1) WO2018076572A1 (zh)

Cited By (3)

Publication number Priority date Publication date Assignee Title
CN108646750A (zh) * 2018-06-08 2018-10-12 杭州电子科技大学 基于uwb非基站便捷式工厂agv跟随方法
CN110689556A (zh) * 2019-09-09 2020-01-14 苏州臻迪智能科技有限公司 跟踪方法、装置及智能设备
CN111722625A (zh) * 2019-12-18 2020-09-29 北京交通大学 时变数量群体机器人接力目标跟踪系统的稳定性分析方法

Families Citing this family (22)

Publication number Priority date Publication date Assignee Title
US10636152B2 (en) * 2016-11-15 2020-04-28 Gvbb Holdings S.A.R.L. System and method of hybrid tracking for match moving
CN107255468B (zh) * 2017-05-24 2019-11-19 纳恩博(北京)科技有限公司 目标跟踪方法、目标跟踪设备及计算机存储介质
CN107608345A (zh) * 2017-08-26 2018-01-19 深圳力子机器人有限公司 一种机器人及其跟随方法和系统
CN108062763B (zh) * 2017-12-29 2020-10-16 纳恩博(北京)科技有限公司 目标跟踪方法及装置、存储介质
CN110096071A (zh) * 2018-01-31 2019-08-06 深圳市诚壹科技有限公司 一种跟踪控制方法、装置及移动终端
CN108931979B (zh) * 2018-06-22 2020-12-15 中国矿业大学 基于超声波辅助定位的视觉跟踪移动机器人及控制方法
US11159798B2 (en) * 2018-08-21 2021-10-26 International Business Machines Corporation Video compression using cognitive semantics object analysis
KR20200087887A (ko) * 2018-12-28 2020-07-22 현대자동차주식회사 차량 및 차량 제어 방법
CN109828596A (zh) * 2019-02-28 2019-05-31 深圳市道通智能航空技术有限公司 一种目标跟踪方法、装置和无人机
CN110346788A (zh) * 2019-06-14 2019-10-18 北京雷久科技有限责任公司 基于雷达和光电融合的高机动和悬停目标全航迹跟踪方法
US11537137B2 (en) * 2019-06-18 2022-12-27 Lg Electronics Inc. Marker for space recognition, method of moving and lining up robot based on space recognition and robot of implementing thereof
CN110977950B (zh) * 2019-11-12 2021-05-25 长沙长泰机器人有限公司 一种机器人抓取定位方法
KR102198904B1 (ko) 2020-01-09 2021-01-06 기술보증기금 분산 딥러닝 모델 기반 기술력평가용 인공지능 모듈을 생성하는 방법 및 인공지능모듈을 적용한 기술력예측방법과 이를 구현하는 시스템, 상기 방법이 구현된 프로그램을 기록한 컴퓨터가 읽을 수 있는 기록매체
US20210258540A1 (en) * 2020-02-13 2021-08-19 Nxp B.V. Motion monitoring and analysis system and method
CN111289944B (zh) * 2020-02-29 2021-10-08 杭州电子科技大学 一种基于uwb定位的无人艇位置航向测定方法
US11950567B2 (en) 2021-03-04 2024-04-09 Sky View Environmental Service Llc Condor monitoring systems and related methods
CN113191336B (zh) * 2021-06-04 2022-01-14 绍兴建元电力集团有限公司 基于图像识别的电力隐患识别方法及系统
CN113923592B (zh) * 2021-10-09 2022-07-08 广州宝名机电有限公司 目标跟随方法、装置、设备及系统
WO2023097577A1 (zh) * 2021-12-01 2023-06-08 浙江大学湖州研究院 一种可拓展的基于uwb和摄像头的相对定位的设备和方法
CN114413868A (zh) * 2022-02-09 2022-04-29 国网浙江省电力有限公司经济技术研究院 基于uwb与电控玻璃的全站仪瞄准反射棱镜系统及方法
CN116740379B (zh) * 2023-07-06 2024-07-16 江苏商贸职业学院 一种结合计算机视觉的目标追踪方法及系统
CN116681731B (zh) * 2023-08-02 2023-10-20 北京观微科技有限公司 目标物体追踪方法、装置、电子设备及存储介质

Citations (6)

Publication number Priority date Publication date Assignee Title
CN102546680A (zh) * 2010-12-15 2012-07-04 北京航天长峰科技工业集团有限公司 一种室内人员定位跟踪系统
CN103139904A (zh) * 2011-11-30 2013-06-05 北京航天长峰科技工业集团有限公司 一种室内人员定位跟踪系统
CN103884332A (zh) * 2012-12-21 2014-06-25 联想(北京)有限公司 一种障碍物判定方法、装置及移动电子设备
CN104754515A (zh) * 2015-03-30 2015-07-01 北京云迹科技有限公司 混合定位辅助地图修正方法及系统
CN104777847A (zh) * 2014-01-13 2015-07-15 中南大学 基于机器视觉和超宽带定位技术的无人机目标追踪系统
CN105157681A (zh) * 2015-08-23 2015-12-16 西北工业大学 室内定位方法、装置和摄像机以及服务器

Family Cites Families (18)

Publication number Priority date Publication date Assignee Title
US7929017B2 (en) * 2004-07-28 2011-04-19 Sri International Method and apparatus for stereo, multi-camera tracking and RF and video track fusion
US7460951B2 (en) * 2005-09-26 2008-12-02 Gm Global Technology Operations, Inc. System and method of target tracking using sensor fusion
JP4413957B2 (ja) * 2007-08-24 2010-02-10 株式会社東芝 移動物体検出装置および自律移動物体
TWI388205B (zh) * 2008-12-19 2013-03-01 Ind Tech Res Inst 目標追蹤之方法與裝置
US8190330B2 (en) * 2009-03-06 2012-05-29 GM Global Technology Operations LLC Model based predictive control for automated lane centering/changing control systems
GB201110159D0 (en) * 2011-06-16 2011-07-27 Light Blue Optics Ltd Touch sensitive display devices
GB201205303D0 (en) * 2012-03-26 2012-05-09 Light Blue Optics Ltd Touch sensing systems
US9384668B2 (en) * 2012-05-09 2016-07-05 Singularity University Transportation using network of unmanned aerial vehicles
CN202838377U (zh) * 2012-10-26 2013-03-27 北京航天长峰科技工业集团有限公司 一种室内人员定位跟踪系统
US9453904B2 (en) * 2013-07-18 2016-09-27 Golba Llc Hybrid multi-camera based positioning
US10099609B2 (en) * 2014-07-03 2018-10-16 InfoMobility S.r.L. Machine safety dome
US10496766B2 (en) * 2015-11-05 2019-12-03 Zoox, Inc. Simulation system and methods for autonomous vehicles
EP3397921A1 (en) * 2015-12-30 2018-11-07 Faro Technologies, Inc. Registration of three-dimensional coordinates measured on interior and exterior portions of an object
CN105931263B (zh) * 2016-03-31 2019-09-20 纳恩博(北京)科技有限公司 一种目标跟踪方法及电子设备
CN105915784A (zh) * 2016-04-01 2016-08-31 纳恩博(北京)科技有限公司 信息处理方法和装置
US10726567B2 (en) * 2018-05-03 2020-07-28 Zoox, Inc. Associating LIDAR data and image data
US11061132B2 (en) * 2018-05-21 2021-07-13 Johnson Controls Technology Company Building radar-camera surveillance system
US11082667B2 (en) * 2018-08-09 2021-08-03 Cobalt Robotics Inc. Contextual automated surveillance by a mobile robot

Patent Citations (6)

Publication number Priority date Publication date Assignee Title
CN102546680A (zh) * 2010-12-15 2012-07-04 北京航天长峰科技工业集团有限公司 一种室内人员定位跟踪系统
CN103139904A (zh) * 2011-11-30 2013-06-05 北京航天长峰科技工业集团有限公司 一种室内人员定位跟踪系统
CN103884332A (zh) * 2012-12-21 2014-06-25 联想(北京)有限公司 一种障碍物判定方法、装置及移动电子设备
CN104777847A (zh) * 2014-01-13 2015-07-15 中南大学 基于机器视觉和超宽带定位技术的无人机目标追踪系统
CN104754515A (zh) * 2015-03-30 2015-07-01 北京云迹科技有限公司 混合定位辅助地图修正方法及系统
CN105157681A (zh) * 2015-08-23 2015-12-16 西北工业大学 室内定位方法、装置和摄像机以及服务器

Non-Patent Citations (1)

Title
See also references of EP3410062A4 *

Cited By (4)

Publication number Priority date Publication date Assignee Title
CN108646750A (zh) * 2018-06-08 2018-10-12 杭州电子科技大学 基于uwb非基站便捷式工厂agv跟随方法
CN108646750B (zh) * 2018-06-08 2021-05-07 杭州电子科技大学 基于uwb非基站便捷式工厂agv跟随方法
CN110689556A (zh) * 2019-09-09 2020-01-14 苏州臻迪智能科技有限公司 跟踪方法、装置及智能设备
CN111722625A (zh) * 2019-12-18 2020-09-29 北京交通大学 时变数量群体机器人接力目标跟踪系统的稳定性分析方法

Also Published As

Publication number Publication date
EP3410062A1 (en) 2018-12-05
US20190049549A1 (en) 2019-02-14
EP3410062A4 (en) 2019-03-06
CN106683123A (zh) 2017-05-17
CN106683123B (zh) 2019-04-02

Similar Documents

Publication Publication Date Title
WO2018076572A1 (zh) 一种目标跟踪方法及目标跟踪装置、存储介质
US10928838B2 (en) Method and device of determining position of target, tracking device and tracking system
US10942529B2 (en) Aircraft information acquisition method, apparatus and device
US11086337B2 (en) Systems and methods for charging unmanned aerial vehicles on a moving platform
EP2972462B1 (en) Digital tethering for tracking with autonomous aerial robot
US20210129982A1 (en) System and method for drone tethering
CN105929850B (zh) 一种具有持续锁定和跟踪目标能力的无人机系统与方法
CN111932588A (zh) 一种基于深度学习的机载无人机多目标跟踪系统的跟踪方法
JP5990453B2 (ja) 自律移動ロボット
US11906983B2 (en) System and method for tracking targets
De Smedt et al. On-board real-time tracking of pedestrians on a UAV
US20190004547A1 (en) Methods and apparatus of tracking moving targets from air vehicles
US20200125100A1 (en) Movable object control method, device and system
US10846541B2 (en) Systems and methods for classifying road features
JP2014119828A (ja) 自律飛行ロボット
Nambi et al. ALT: towards automating driver license testing using smartphones
US11967112B2 (en) Method and apparatus for detecting calibration requirement for image sensors in vehicles
US20200125111A1 (en) Moving body control apparatus
Xu et al. A stereo visual navigation method for docking autonomous underwater vehicles
TWI773112B (zh) 道路監測系統、裝置及方法
US20240037759A1 (en) Target tracking method, device, movable platform and computer-readable storage medium
WO2022198509A1 (zh) 目标跟踪方法、模型训练方法、装置、可移动平台及存储介质
KR101784419B1 (ko) 사용자 시점 원격 디바이스 제어 장치 및 그 방법
WO2020088397A1 (zh) 位置推定装置、位置推定方法、程序以及记录介质
CN114296479A (zh) 一种基于图像的无人机对地面车辆跟踪方法及系统

Legal Events

Date Code Title Description
WWE Wipo information: entry into national phase

Ref document number: 2017863668

Country of ref document: EP

ENP Entry into the national phase

Ref document number: 2017863668

Country of ref document: EP

Effective date: 20180828

121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 17863668

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE