WO2018214909A1 - Target tracking method, target tracking device, and computer storage medium - Google Patents

Target tracking method, target tracking device, and computer storage medium

Info

Publication number: WO2018214909A1
Authority: WO, WIPO (PCT)
Application number: PCT/CN2018/088020
Inventor
唐矗
Original Assignee
纳恩博(北京)科技有限公司
Application filed by 纳恩博(北京)科技有限公司
Publication of WO2018214909A1

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C11/00 Photogrammetry or videogrammetry, e.g. stereogrammetry; Photographic surveying
    • G01C11/02 Picture taking arrangements specially adapted for photogrammetry or photographic surveying, e.g. controlling overlapping of pictures
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S13/00 Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
    • G01S13/66 Radar-tracking systems; Analogous systems
    • G01S13/72 Radar-tracking systems; Analogous systems for two-dimensional tracking, e.g. combination of angle and range tracking, track-while-scan radar
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/70 Determining position or orientation of objects or cameras

Definitions

  • The present application relates to the field of electronic technologies, but is not limited thereto, and in particular to a target tracking method, a target tracking device, and a computer storage medium.
  • In the related art, the tracking device uses a visual system to perform target tracking through image acquisition. However, various interference factors are usually present: on the one hand, an interference factor easily causes the target to be lost; on the other hand, once an interference factor causes the tracking target to be lost, it is difficult to retrieve the tracking target.
  • embodiments of the present application are expected to provide a target tracking method, a target tracking device, and a computer storage medium.
  • a first aspect of the embodiments of the present application provides a target tracking method, which is applied to a target tracking device, and includes:
  • a second aspect of the embodiments of the present application provides a target tracking device, including:
  • An acquisition unit configured to acquire a tracking image
  • a first acquiring unit configured to acquire first tracking information for visual tracking of the target object based on the tracking image
  • a detecting unit configured to detect a wireless signal for performing the target tracking
  • a second acquiring unit configured to parse the wireless signal and acquire second tracking information for performing wireless signal tracking on the target object
  • the determining unit is configured to determine the final tracking information of the target object by combining the first tracking information and the second tracking information.
  • a third aspect of the embodiments of the present application provides a target tracking device, including:
  • An image acquisition module configured to collect a tracking image
  • An antenna module configured to detect a wireless signal
  • a memory configured to store a computer program
  • the processor is respectively connected to the image acquisition module, the antenna module and the memory, and is configured to execute the target tracking method provided by any one or more of the foregoing technical solutions by executing the computer program.
  • a fourth aspect of the embodiments of the present application provides a computer storage medium, where the computer storage medium stores a computer program; after the computer program is executed by the processor, the target tracking method provided by any one or more of the foregoing technical solutions can be executed.
  • In the embodiments of the present application, the target tracking device uses two tracking methods to track the same target object, so that when one tracking method fails, effective tracking of the target object is maintained as long as the other remains effective. This solves the problem of the high loss rate of a single tracking method and improves the success rate of tracking;
  • at the same time, the tracking information of the still-effective tracking method can be used to restore the failed tracking method to effectiveness, so that the target object is retrieved again; restoring the failed tracking method reduces the difficulty of retrieving the target object.
  • The two tracking modes used simultaneously by the target tracking device are visual tracking, in which tracking information is obtained through tracking-image acquisition and analysis, and wireless signal tracking, in which tracking information is obtained through wireless signal detection.
  • The two tracking modes assist each other. The probability of both tracking methods failing simultaneously is low, which reduces the probability of losing the target object and further improves the tracking success rate.
  • FIG. 1 is a schematic flowchart diagram of a first target tracking method according to an embodiment of the present application
  • FIG. 2 is a schematic flowchart of a second target tracking method according to an embodiment of the present application.
  • FIG. 3 is a schematic diagram of a display effect of a tracking image according to an embodiment of the present application.
  • FIG. 4 is a view showing a display effect of a candidate tracking area included in FIG. 3;
  • FIG. 5 is a schematic structural diagram of a tracking device according to an embodiment of the present application.
  • FIG. 6 is a schematic structural diagram of another tracking device according to an embodiment of the present disclosure.
  • FIG. 7 is a schematic flowchart diagram of a third target tracking method according to an embodiment of the present application.
  • This embodiment provides a target tracking method, which is applied to a target tracking device. As shown in FIG. 1 , the method includes:
  • Step S110: collecting a tracking image;
  • Step S120: acquiring first tracking information for visual tracking of the target object based on the tracking image;
  • Step S130: detecting a wireless signal for performing the target tracking;
  • Step S140: parsing the wireless signal and acquiring second tracking information for performing wireless signal tracking on the target object;
  • Step S150: combining the first tracking information and the second tracking information to determine the final tracking information of the target object. A minimal sketch of this flow is given below.
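  • The following Python sketch illustrates the S110 to S150 loop only; it is not the claimed implementation. The TrackInfo fields, the averaging fusion, and the capture/parse callables are hypothetical placeholders introduced for illustration.

```python
from dataclasses import dataclass

@dataclass
class TrackInfo:
    distance: float  # metres, target relative to the tracking device
    angle: float     # radians, target bearing relative to the device heading

def fuse(first: TrackInfo, second: TrackInfo) -> TrackInfo:
    # S150: one possible combination, averaging the two estimates.
    return TrackInfo((first.distance + second.distance) / 2,
                     (first.angle + second.angle) / 2)

def tracking_step(capture_image, detect_signal, parse_visual, parse_wireless) -> TrackInfo:
    image = capture_image()          # S110: collect a tracking image
    first = parse_visual(image)      # S120: first tracking information (visual)
    signal = detect_signal()         # S130: detect the wireless signal
    second = parse_wireless(signal)  # S140: second tracking information (wireless)
    return fuse(first, second)       # S150: determine the final tracking information
```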
  • the target tracking method provided in this embodiment may be an information processing method applied to the target tracking device.
  • the target tracking device may be a mobile robot moving on the ground or a flying robot (such as a drone) flying in the air.
  • In this embodiment, the target tracking device integrates an acquisition module and a wireless signal detection module. The acquisition module can be used for image acquisition to implement visual tracking. The wireless signal detection module can be used to perform wireless signal tracking by transceiving wireless signals with respect to the target object; it may include at least a receiving antenna for the wireless signal, and in some embodiments may also include a transmitting antenna that transmits the wireless signal.
  • The wireless signal may be a UWB signal; UWB is an abbreviation of Ultra Wideband, and a UWB signal is a pulse signal with a predetermined duration, which may be on the order of nanoseconds to picoseconds.
  • the pulse signal can be a non-sinusoidal pulse signal.
  • the wireless signal can be a variety of radio frequency signals.
  • the step S110 collects a tracking image
  • the first tracking information can be obtained by parsing the collected tracking image
  • The first tracking information herein may include position information of the tracked target object, for example, its distance and/or angle relative to the tracking device.
  • The tracking image may be parsed to determine the imaging position, imaging area, and the like of the target object in the tracking image; then, based on acquisition parameters (e.g., the image acquisition direction and focus position of the tracking device), the distance and/or angle of the target object relative to the target tracking device can be derived.
  • The camera that captures the tracking image may comprise a depth camera and a red, green, and blue (RGB) camera. The depth camera and the RGB camera each collect images; when analyzing the depth image acquired by the depth camera and the RGB image acquired by the RGB camera, the positional relationship between the two cameras is combined to determine the mapping relationship of the target object between the RGB image and the depth image, which further locates first tracking information such as the distance and/or angle of the target object relative to the tracking device.
  • The target object may be various tracked objects, such as a person or an animal. The target object is usually a living body, but in some cases it may also be another object, for example a mobile device; the tracking device can thus be used to track another movable device.
  • the wireless signal is detected in the step S130.
  • the achievable manner of the step S130 in this embodiment may include multiple types:
  • The first type: receiving a wireless signal sent by a transmitting device carried by the target object;
  • The second type: the tracking device transmits a first signal and detects a second signal reflected back based on the first signal; the first signal and the second signal are both wireless signals, and the target tracking device detects the second signal reflected back after the first signal reaches the target object.
  • In either manner, the wireless signal transmitted from the location of the target object is received in step S130.
  • By parsing the wireless signal, parameters such as the direction and distance of the target object with respect to the tracking device may be located. For example, according to the signal strength and/or received power of the received wireless signal, combined with the known transmission intensity and/or transmission power, a transmission loss model can be used to determine information such as the distance and/or angle of the target object relative to the tracking device, thereby obtaining the second tracking information.
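  • The application does not fix a particular transmission loss model; a common choice is the log-distance path loss model, sketched below under the assumption of a calibrated 1 m reference power and a known path loss exponent.

```python
def distance_from_rssi(rssi_dbm: float, ref_power_dbm: float = -40.0,
                       path_loss_exponent: float = 2.0) -> float:
    """Estimate the target's distance (m) from received signal strength.

    ref_power_dbm is the RSSI calibrated at a 1 m reference distance;
    path_loss_exponent is about 2 in free space and larger indoors.
    """
    return 10 ** ((ref_power_dbm - rssi_dbm) / (10 * path_loss_exponent))
```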
  • In some embodiments, the wireless signal tracking may be based on a wireless signal transmitted between the target object and the target tracking device, using Two-Way Ranging (TWR) to calculate the relative distance between the target object and the target tracking device. TWR is a two-way ranging method: communication units respectively set on the target object and the tracking device calculate the time of flight of the signal from the time differences between the signals they transmit to and receive from each other, and the relative distance between the communication units can be calculated accordingly.
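  • A minimal single-sided TWR sketch follows; it assumes the responder's turnaround time is known to the tracker and ignores the clock-drift corrections that practical UWB ranging (e.g., double-sided TWR) adds.

```python
SPEED_OF_LIGHT = 299_792_458.0  # m/s

def twr_distance(t_round_s: float, t_reply_s: float) -> float:
    """Two-way ranging: t_round_s is measured by the tracker between
    sending its poll and receiving the response; t_reply_s is the
    responder's known turnaround time. Half the remainder is the
    one-way time of flight."""
    time_of_flight = (t_round_s - t_reply_s) / 2.0
    return SPEED_OF_LIGHT * time_of_flight
```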
  • In some embodiments, the wireless signal tracking may also use the Phase Difference of Arrival (PDOA), based on the wireless signal transmitted between the target object and the target tracking device, to calculate the relative angle between them. PDOA, the phase difference of signal arrival, is a positioning method that uses phase differences: by measuring the phase difference of the signal arriving at a monitoring station, the relative distance and angle between the signal source and the monitoring station can be determined.
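  • A PDOA angle estimate can be sketched as follows, assuming two receive antennas spaced at most half a wavelength apart so that the measured phase difference is unambiguous.

```python
import math

def pdoa_angle(phase_diff_rad: float, wavelength_m: float,
               antenna_spacing_m: float) -> float:
    """Angle of arrival (radians) from the phase difference between two
    antennas: sin(theta) = delta_phi * lambda / (2 * pi * spacing)."""
    sin_theta = phase_diff_rad * wavelength_m / (2 * math.pi * antenna_spacing_m)
    return math.asin(max(-1.0, min(1.0, sin_theta)))
```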
  • step S110 may precede the step S130, or the step S130 may precede the step S110.
  • the step S110 and the step S130 are simultaneously performed.
  • the step S110 to the step S120 may be repeatedly performed according to the first time interval, and the steps S130 to S140 may be repeatedly performed according to the second time interval. Any two adjacent time intervals of the first time interval may be equal, thereby implementing periodic execution; any two time intervals of the second time interval may also be equal, and periodic execution is also implemented.
  • any of the first time intervals and/or the second time intervals may not be equal.
  • In some embodiments, the first time interval and the second time interval may be dynamically determined. For example, when the first tracking information at historical moments indicates that the moving rate of the target object satisfies a first rate condition, e.g., is greater than a first preset rate threshold, the first time interval is shortened; if the first rate condition is not satisfied, the first time interval may be increased. The first time interval is thus dynamically determined. Similarly, when the moving rate satisfies a second rate condition, the second time interval is shortened; otherwise, the second time interval may be increased, so that the second time interval is likewise dynamically determined. A sketch of such a policy follows.
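  • The sketch below illustrates one such dynamic determination; the halving/doubling policy, the rate threshold, and the interval bounds are assumptions for illustration, not taken from the application.

```python
def next_interval(interval_s: float, target_speed_m_s: float,
                  rate_threshold_m_s: float = 1.0,
                  min_s: float = 0.05, max_s: float = 1.0) -> float:
    """Shorten the acquisition interval when the target's moving rate
    satisfies the rate condition; lengthen it otherwise."""
    if target_speed_m_s > rate_threshold_m_s:
        return max(min_s, interval_s / 2)
    return min(max_s, interval_s * 2)
```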
  • the frequency of execution of the step S150 is determined by the lower frequency of obtaining the first tracking information and the second tracking information.
  • the execution time interval of the step S110 to the step S120 is equal to the execution time interval of the step S130 to the step S140, that is, the first tracking information and the second tracking information are obtained synchronously.
  • In this embodiment, two tracking methods are introduced: one is visual tracking, which performs target tracking through image acquisition; the other performs tracking of the target object based on the detection of wireless signals. Finally, the two types of tracking information obtained by the two tracking methods (i.e., the first tracking information and the second tracking information) determine the final tracking information of the target object.
  • In this way, when one tracking method fails, the other tracking method can still provide tracking information. The probability of two tracking methods failing simultaneously is low relative to the probability of a single method failing, so compared with a target tracking device using a single tracking method, the probability of tracking failure is reduced and the tracking success rate is improved.
  • The two tracking methods used in this embodiment are visual tracking and wireless signal tracking.
  • The two tracking methods have different characteristics and suit different tracking scenarios: in scenarios where visual tracking easily fails, the tracking ability of wireless signal tracking may remain strong, and in scenarios where wireless signal tracking fails, visual tracking may remain strong. This further reduces the chance that the two tracking modes fail simultaneously and again improves the success rate of target tracking.
  • the two tracking methods may utilize respective tracking algorithms to determine whether a tracking invalid condition occurs at the current time.
  • For visual tracking, whether the tracking is valid is determined by detecting whether the imaging of the target object is successfully located in the currently acquired tracking image.
  • whether wireless signal tracking is valid is determined by whether the wireless signal is detected within the current detection period.
  • In some embodiments, the distance and/or angle of the target object relative to the tracking device is determined based on the first tracking information; when the distance is less than a certain specified distance value and/or the angle is less than a certain specified angle value, the visual tracking is considered valid; otherwise, the visual tracking is invalid.
  • Similarly, when the distance and/or angle determined from the second tracking information satisfies the corresponding specified values, the wireless signal tracking is considered valid; otherwise, the wireless signal tracking fails.
  • the step S150 may include:
  • Step S151: combining the first tracking information and the second tracking information to judge whether target loss has occurred in the visual tracking and/or the wireless signal tracking;
  • Step S152: determining the final tracking information according to the result of the judgment.
  • In this embodiment, the first tracking information and the second tracking information are combined in the judgment, which improves its accuracy; the final tracking information is then determined based on the judgment result.
  • the final tracking information in this embodiment may also include information on the distance and/or angle of the target object with respect to the tracking device.
  • The step S151 may include: calculating the degree of approximation between the first tracking information and the second tracking information;
  • the step S152 may include: comparing the calculated approximation value with a preset first threshold, and, when the result of the comparison indicates that the approximation does not reach a preset first approximation requirement, determining that target loss has occurred in at least one of the visual tracking and the wireless signal tracking.
  • Determining whether the approximation reaches the first approximation requirement may include: taking the approximation as the position difference indicated by the two pieces of tracking information; if the position difference is less than a specific threshold, the first approximation requirement can be considered reached, otherwise it is considered not reached.
  • the particular threshold may be one of the first thresholds.
  • There are various ways to calculate the degree of approximation between the first tracking information and the second tracking information and to judge based on it. Several options are provided below:
  • the step S151 may include:
  • When the first tracking information and the second tracking information both include the distance and angle of the target object relative to the target tracking device, as detected by the respective tracking methods, a coordinate system with the target tracking device as the origin is constructed; a first coordinate value in the coordinate system is determined based on the first tracking information, and a second coordinate value is determined based on the second tracking information; a difference vector is then constructed from the first coordinate value and the second coordinate value, and the modulus of the difference vector is calculated.
  • The step S152 may include: when the modulus value is not less than a predetermined mode threshold, considering that target loss has occurred in at least one of the visual tracking and the wireless signal tracking. A sketch of this test follows.
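  • The sketch below illustrates the difference-vector test; the polar-to-Cartesian conversion is an assumption about how the (distance, angle) estimates are mapped into the device-centred coordinate system.

```python
import math

def difference_modulus(d1: float, theta1: float,
                       d2: float, theta2: float) -> float:
    """Place both (distance, angle) estimates in a coordinate system with
    the tracking device at the origin and return the modulus of the
    difference vector between the two coordinate values."""
    x1, y1 = d1 * math.cos(theta1), d1 * math.sin(theta1)
    x2, y2 = d2 * math.cos(theta2), d2 * math.sin(theta2)
    return math.hypot(x1 - x2, y1 - y2)

# Target loss in at least one mode is flagged when the modulus reaches
# the predetermined mode threshold:
# lost = difference_modulus(d_v, th_v, d_u, th_u) >= mode_threshold
```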
  • the step S151 may include:
  • the approximate degree is calculated based on the first average distance, the second average distance, the first average angle, and the second average angle.
  • Based on the first tracking information, a first average distance and a first average angle of the target object relative to the target tracking device within a first preset time may be calculated; based on the second tracking information, a second average distance and a second average angle of the target object relative to the target tracking device within the first preset time may be calculated.
  • the two average distances and two average angles are combined to calculate the approximate degree.
  • The first average distance can be expressed as $\bar{d}_1$, the first average angle as $\bar{\theta}_1$, the second average distance as $\bar{d}_2$, and the second average angle as $\bar{\theta}_2$. The approximation $S_{conf}$, the distance between the two average position estimates, can be calculated using the following formula: $S_{conf} = \sqrt{\bar{d}_1^2 + \bar{d}_2^2 - 2\bar{d}_1\bar{d}_2\cos(\bar{\theta}_1 - \bar{\theta}_2)}$
  • the average distance difference between the first average distance and the second average distance, and the average angle difference between the first average angle and the second average angle may be directly combined to determine whether at least one target is lost.
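  • The sketch below computes the windowed $S_{conf}$ using the law-of-cosines form reconstructed above; the exact formula in the original filing may differ, so treat this as an illustration under that assumption.

```python
import math

def s_conf(dists_1, angles_1, dists_2, angles_2) -> float:
    """Average each mode's distance and angle samples over the first
    preset time, then return the distance between the two average
    polar positions."""
    d1 = sum(dists_1) / len(dists_1)
    t1 = sum(angles_1) / len(angles_1)
    d2 = sum(dists_2) / len(dists_2)
    t2 = sum(angles_2) / len(angles_2)
    return math.sqrt(d1 * d1 + d2 * d2 - 2 * d1 * d2 * math.cos(t1 - t2))
```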
  • To determine whether it is the visual tracking or the wireless signal tracking that has failed, it may first be determined whether one tracking mode is valid based on its tracking information; if that tracking mode is valid, the other tracking mode is deemed invalid.
  • In a first mode, the step S152 further includes:
  • determining whether the second tracking information has a continuity change is based on the continuity of the movement of the target object, and may include:
  • if the second tracking information changes continuously within the second preset time, the wireless signal tracking may be considered valid.
  • If the target tracking device obtains the second tracking information multiple times and the values obtained change over time, continuous tracking between the target tracking device and the target object has been maintained; this indicates that, during the acquisition of the multiple pieces of tracking information, the target object remained within the tracking range of the target tracking device, so the device is currently tracking the target object continuously and effectively.
  • the determining whether the second tracking information has a continuity change in the embodiment may further include:
  • The first preset condition herein may include: during continuous tracking, the tracking device moves at a speed within its predetermined speed range, and the first current relative position parameter indicates that the distance between the target object and the tracking device always remains within a certain range, with no case where the distance suddenly becomes very large or drops to zero.
  • the approximation in this embodiment can characterize the similarity of the two tracking information.
  • the step S152 may further include:
  • the second mode is used in combination with the third mode.
  • When the first current relative position parameter does not satisfy the first preset condition, the third manner is further used.
  • If the second current relative position parameter satisfies the second preset condition, it is determined that the visual tracking is valid and the wireless signal tracking has failed.
  • the second mode and the third mode are used in combination, and may further include:
  • the final judgment result may be determined according to the confidence of the visual tracking and the wireless signal tracking. For example, assuming that the confidence of the visual tracking is higher than the confidence of the wireless signal, it is confirmed that the determination result that the visual tracking is valid and the wireless signal tracking is invalid is the final determination result. Assuming that the confidence level of the visual tracking is not higher than the confidence level of the wireless signal, it is confirmed that the determination result that the visual tracking is invalid and the wireless signal tracking is valid is the final judgment result.
  • The confidence of the two tracking methods here may be received from other devices, or may be determined by the tracking device from its own statistics of previous tracking failures. For example, if historical statistics show that the frequency of visual tracking failure is higher than that of wireless signal tracking, the confidence of the visual tracking is determined to be lower than that of the wireless signal tracking; otherwise, the confidence of the wireless signal tracking is determined to be lower than that of the visual tracking.
  • the confidence level may be a degree of credibility that characterizes the tracking parameter currently given by the corresponding tracking mode.
  • the determining whether the first relative position parameter meets the first preset condition comprises:
  • The movement of the target object has a certain continuity, and its movement rate generally does not change sharply.
  • Therefore, the first current relative position parameter (which may include at least the current distance and, in other implementations, may also include the current angle) can be compared against historical values using the first specific value: if the deviation is smaller than the first specific value, the first current relative position parameter may be considered to satisfy the first preset condition; otherwise, it may be considered not to satisfy it.
  • For example, let the current time be t0, historical time 1 be t1, and historical time 2 be t2. The first distances between the relative position at t0 and the relative positions at t1 and t2 are calculated respectively, yielding two first distances; each first distance is compared with the first specific value, and when all the first distances are smaller than the first specific value, it may be determined that the first current position parameter satisfies the first preset condition.
  • In some embodiments, first distances may be calculated between the relative position corresponding to the second tracking information detected at the current moment and the relative positions corresponding to the second tracking information detected at one or more historical moments, obtaining a plurality of first distances; when the proportion of these first distances that are smaller than the first specific value reaches a preset ratio, it is determined that the first current position parameter satisfies the first preset condition, which is equivalent to determining that the wireless signal tracking is valid.
  • Alternatively, the relative position parameters at a plurality of historical moments may be determined based on a plurality of pieces of second tracking information, and the average of these relative position parameters calculated; the first distance is then the distance between the first current position parameter and this average value. If the first distance is less than the first specific value, the first current position parameter may be considered to satisfy the first preset condition.
  • The first specific value herein may be a preset value, determined according to the initial distance at which the target tracking device starts tracking the target object. During tracking, the target tracking device dynamically adjusts its own moving rate to remain consistent with the moving rate of the target object, so that the target object is always within its tracking range. The first specific value may then be c times the initial distance, where c is a positive number less than 1, for example 0.1 or 0.2. A sketch of this continuity check follows.
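  • In the sketch below, positions are 2-D points derived from the tracking information; the ratio threshold min_ratio is an assumption added for illustration, combining the all-distances and proportion-based variants described above.

```python
import math

def continuity_satisfied(current_pos, history, initial_distance: float,
                         c: float = 0.2, min_ratio: float = 0.8) -> bool:
    """First preset condition: compute the first distance from the current
    relative position to each recent historical position and require that
    enough of them fall below the first specific value, taken here as
    c times the initial tracking distance."""
    first_specific_value = c * initial_distance
    first_distances = [math.dist(current_pos, past) for past in history]
    close = sum(1 for d in first_distances if d < first_specific_value)
    return close / len(first_distances) >= min_ratio
```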
  • Determining whether the second current relative position parameter meets the second preset condition comprises:
  • The second historical relative position parameter is a relative position parameter of the target object relative to the tracking device determined based on first tracking information detected at a historical moment before the current moment, or is the average of the relative position parameters of the target object relative to the tracking device determined based on first tracking information detected at at least two historical moments before the current moment;
  • In this embodiment, this judgment can likewise be made based on the continuity of the target object's movement. The determination of the second specific value and the second historical relative position parameter is similar to that of the first specific value and the first historical relative position parameter, and is not repeated here.
  • the first specific value and the second specific value may be the same.
  • In other embodiments, the first specific value may be determined according to the accuracy of the visual tracking, and the second specific value according to the accuracy of the wireless signal tracking; in that case, the first specific value and the second specific value may be different.
  • In the above manner, the first preset condition and/or the second preset condition are evaluated to determine whether the corresponding tracking mode is valid.
  • In some embodiments, a classifier for judging whether the tracking of each tracking mode of the target tracking device is effective may also be determined in advance; the classifier may be a neural network, a support vector machine, or the like.
  • When determining whether the corresponding tracking mode is valid, the tracking information detected at the current moment and at a plurality of historical moments adjacent to the current moment may be input to the classifier, which then gives the corresponding judgment result. This can reduce the amount of calculation needed to judge whether the corresponding tracking mode is effective.
  • The classifier can be trained using tracking information from effective tracking as positive examples and tracking information from invalid tracking as negative examples; the validated classifier is then used to judge whether tracking is valid or invalid during the device's subsequent tracking, as sketched below.
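  • A minimal sketch of such a classifier, assuming scikit-learn and synthetic placeholder data; a real system would train on logged tracking information labeled valid/invalid, and the feature layout is an assumption for illustration.

```python
import numpy as np
from sklearn.neural_network import MLPClassifier

# Feature vector: tracking information at the current moment and at a few
# adjacent historical moments, e.g. [d_t, theta_t, d_t1, theta_t1, ...].
rng = np.random.default_rng(0)
X_train = rng.random((200, 6))               # placeholder features
y_train = (X_train[:, 0] < 0.5).astype(int)  # placeholder labels: 1 = effective

clf = MLPClassifier(hidden_layer_sizes=(32,), max_iter=500, random_state=0)
clf.fit(X_train, y_train)

def tracking_mode_valid(recent_info: np.ndarray) -> bool:
    """recent_info: the 6-element feature vector for the current moment."""
    return bool(clf.predict(recent_info.reshape(1, -1))[0])
```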
  • In some embodiments, the method further includes one or more of the following scenarios:
  • Scenario 1: when the visual tracking fails and the wireless signal tracking is valid, determining the current acquisition direction based on the signal detection direction of the wireless signal tracking, so that the visual tracking acquires a tracking image that includes the target object.
  • The method may further include Scenario 2: when the visual tracking is valid and the wireless signal tracking fails, determining the signal detection direction of the wireless signal tracking based on the acquisition direction in which the visual tracking acquires the tracking image.
  • The acquisition direction may be the orientation of the camera that collects the tracking image; when the orientation of the camera changes, the image content of the captured tracking image changes accordingly.
  • The wireless signal detected in step S130 is transmitted from the location of the target object. If the antenna detecting the wireless signal does not face the transmission direction of the wireless signal, the wireless signal may be missed. Therefore, in this embodiment, it is determined from the acquisition direction and the signal detection direction whether the acquisition direction of the visual tracking needs to be adjusted, together with the first adjustment parameter.
  • The first adjustment parameter herein may include a first adjustment amount and a first adjustment direction. For example, if the adjustment amount is 5 degrees, the adjustment direction specifies whether to adjust 5 degrees to the left or 5 degrees to the right.
  • The second adjustment parameter herein may likewise include a second adjustment amount and a second adjustment direction.
  • In some embodiments, the acquisition direction of the visual tracking is consistent with the signal detection direction of the wireless signal tracking from the beginning. This reduces the calculation of adjustment parameters for, and the adjustment of, the acquisition direction and the signal detection direction. For example, the camera and the wireless signal detection antenna are integrated in one structure in the tracking device, so that the acquisition direction of the camera and the signal detection direction of the wireless signal always remain consistent.
  • In this case, when the signal detection direction is consistent with the acquisition direction, determining the current acquisition direction includes maintaining the current acquisition direction of the tracking device, and determining the signal detection direction of the wireless signal tracking includes maintaining the current signal detection direction of the tracking device.
  • the acquisition direction of the camera may not coincide with the signal detection direction.
  • In that case, the adjustment parameters need to be determined based on the acquisition direction and the signal detection direction. The acquisition direction or signal detection direction corresponding to the currently valid tracking mode is taken as the adjustment target, and the adjustment parameter is determined accordingly.
  • In some embodiments, when the signal detection direction is inconsistent with the acquisition direction, determining the current acquisition direction includes: obtaining a relative position parameter of the target object relative to the tracking device based on the second tracking information, and determining the acquisition direction based on the relative position parameter and the signal detection direction; determining the signal detection direction of the wireless signal tracking includes: obtaining a relative position parameter of the target object relative to the tracking device based on the first tracking information, and determining the signal detection direction based on the relative position parameter and the acquisition direction.
  • the step S150 may include at least one of the following:
  • the final tracking information is calculated by using the first tracking information and the second tracking information as the inputs of a preset functional relationship.
  • In step S150, the tracking information of the currently effective tracking mode may be taken as the final tracking information.
  • Alternatively, in step S150 the tracking information of the two tracking modes may be combined, e.g., the final tracking information may be calculated by processing such as averaging.
  • In other embodiments, the tracking information of the tracking mode with the higher confidence can be used directly as the final tracking information.
  • For example, when the wireless signal tracking is UWB signal tracking and has the higher confidence, the tracking parameter of the UWB tracking can be used directly as the final tracking information.
  • the priority of the two tracking modes may be preset. For example, in the tracking mode of UWB tracking and visual tracking, the priority of the visual tracking may be set to be higher than the priority of the UWB.
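  • The sketch below illustrates these S150 strategies in one function; the tuple-based tracking information and the specific priority/mean policy are assumptions for illustration, not the claimed combination rule.

```python
def final_tracking_info(first, second, visual_valid: bool,
                        wireless_valid: bool, mode: str = "mean"):
    """Combine (distance, angle) tuples from the two tracking modes."""
    if visual_valid and not wireless_valid:
        return first          # only the visual estimate is trustworthy
    if wireless_valid and not visual_valid:
        return second         # only the wireless estimate is trustworthy
    if mode == "priority":    # preset priority, e.g. visual over UWB
        return first
    # default: fuse both valid estimates by averaging
    return tuple((a + b) / 2 for a, b in zip(first, second))
```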
  • the method further includes:
  • When the visual tracking has failed, in the process of tracking the target object based on the second tracking information, first tracking information obtained by visual tracking is acquired, and the degree of approximation between the first tracking information and the second tracking information is calculated; the resulting approximation value is compared with a preset second threshold, and when the result of the comparison indicates that the approximation reaches a preset second approximation requirement, it is determined that the visual tracking has been restored to validity.
  • the method further includes:
  • When the wireless signal tracking has failed, in the process of tracking the target object based on the first tracking information, second tracking information obtained by wireless signal tracking is acquired, and the degree of approximation between the first tracking information and the second tracking information is calculated; the resulting approximation value is compared with a preset third threshold, and when the result of the comparison indicates that the approximation reaches a preset third approximation requirement, it is determined that the wireless signal tracking has been restored to validity.
  • the values of the first threshold, the second threshold, and the third threshold may be the same or different.
  • the manner of calculating the degree of approximation based on the first tracking information and the second tracking information may be the same as the foregoing calculation manner of determining whether at least one target loss occurs.
  • In some embodiments, the calculation of the aforementioned approximation is triggered only when the relative position parameter of the target object relative to the tracking device, determined according to the first tracking information or the second tracking information, satisfies a specific condition; the corresponding approximation is then compared with the second threshold or the third threshold. This reduces the amount of calculation and the number of recovery verifications.
  • In some embodiments, determining whether the visual tracking has been restored to validity may further include the following.
  • The first image feature may include an outer contour feature of the tracked target object; for example, if the tracking target is a person, the first image feature may be a human-body outer edge feature.
  • The second image feature can include the first image feature but additionally contains more detailed features, so that the target object can be identified from among multiple objects of the same type.
  • For example, the second image feature may include the facial contour features of the tracked user, exclusive skin color features, and clothing and wear features of the tracked object.
  • The first image feature and the second image feature may include: a contour feature, a texture feature, a color feature, and a size feature.
  • For example, if the target object is a tracked person, the tracking image may include the figures of several people and may further include figures of a small cart and a container. Based on the outer contour feature of a person, the regions of the tracking image in which figures of people are displayed are segmented out as candidate image regions; image features are then extracted within the candidate image regions and matched against the second image feature, and the matching degree is determined.
  • The matching degree here may include a parameter such as the proportion of successfully matched features among all the second image features to be matched.
  • When the matching degree is greater than a fourth threshold, it may be considered that the target object is detected in the current tracking image and that the current tracking recovery is valid. A sketch of this test follows.
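  • The sketch below abstracts the second image features as a set of discrete descriptors; the fourth threshold value of 0.7 is an assumption added for illustration.

```python
def matching_degree(candidate_features: set, target_features: set) -> float:
    """Proportion of the target's second image features found among the
    features extracted from a candidate image region."""
    if not target_features:
        return 0.0
    return len(candidate_features & target_features) / len(target_features)

def visual_recovery_valid(candidate_features: set, target_features: set,
                          fourth_threshold: float = 0.7) -> bool:
    return matching_degree(candidate_features, target_features) > fourth_threshold
```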
  • FIG. 3 is a tracking image
  • FIG. 4 is a schematic diagram of the tracking image shown in FIG. 3; the area enclosed by the dotted line in FIG. 4 is the candidate image area.
  • In some embodiments, the method further includes: after the signal detection direction has been adjusted, directly detecting the second tracking information and re-determining the relative position parameter of the target object relative to the tracking device, wherein the relative position parameter comprises a relative distance and/or a relative angle. When the relative distance is determined to be less than a fifth threshold and/or the relative angle is less than a sixth threshold, the tracking recovery is considered valid; otherwise, the signal detection direction needs to be adjusted further to ensure that the tracking is effective.
  • the embodiment provides a target tracking device, including:
  • the collecting unit 110 is configured to collect a tracking image
  • the first obtaining unit 120 is configured to acquire first tracking information for visual tracking of the target object based on the tracking image;
  • the detecting unit 130 is configured to detect a wireless signal that performs the target tracking
  • the second acquiring unit 140 is configured to parse the wireless signal, and acquire second tracking information that performs wireless signal tracking on the target object.
  • the determining unit 150 is configured to determine final tracking information of the target object by combining the first tracking information and the second tracking information.
  • the tracking device may be various mobile electronic devices such as a ground mobile robot or a flying robot.
  • The collection unit 110 may include an acquisition module capable of image acquisition; the acquisition module may include one or more cameras for image acquisition, which may include a color camera, a depth camera, and a stereo camera for 3D image acquisition. The depth camera automatically excludes color information through depth image acquisition, thereby concealing color-related private information about the target object.
  • the detecting unit 130 may correspond to an antenna or the like capable of receiving a wireless signal.
  • the first obtaining unit 120, the second obtaining unit 140, and the determining unit 150 may correspond to a processor or the like in the tracking device.
  • the processor can include a central processing unit, a microprocessor, a digital signal processor, a programmable array or an application processor, and the like.
  • The processor can obtain the first tracking information and the second tracking information by executing the computer program, and finally determine the final tracking information of the target object.
  • The tracking device provided in this embodiment suffers less target loss and has a high tracking success rate.
  • the determining unit 150 is configured to determine, according to the first tracking information and the second tracking information, whether the visual tracking and the wireless signal tracking have a target loss; according to the determining The result of the determination determines the final tracking information.
  • In some embodiments, the determining unit 150 is configured to calculate the degree of approximation between the first tracking information and the second tracking information, compare the calculated approximation value with a preset first threshold, and determine that target loss has occurred in at least one of the visual tracking and the wireless signal tracking when the result of the comparison indicates that the approximation does not reach a preset first approximation requirement.
  • The determining unit 150 may correspond to a calculator or a processor with computing capability, by which the approximation can be calculated.
  • In some embodiments, the determining unit 150 is configured to calculate, according to the first tracking information, a first average distance and a first average angle of the target object relative to the tracking device within a first preset time; to calculate, according to the second tracking information, a second average distance and a second average angle of the target object relative to the tracking device within the first preset time; and to calculate the approximation based on the first average distance, the second average distance, the first average angle, and the second average angle.
  • In some embodiments, the determining unit 150 is configured to compare the calculated approximation value with a preset first threshold; when the result of the comparison indicates that the approximation does not reach the preset first approximation requirement, it either directly determines that the visual tracking is invalid and the wireless signal tracking is valid, or determines whether the second tracking information within the second preset time has a continuity change: if so, it determines that the wireless signal tracking is valid and the visual tracking is invalid; if not, it determines that the visual tracking is valid and the wireless signal tracking is invalid.
  • In some embodiments, the determining unit 150 is further configured to compare the calculated approximation value with the preset first threshold; when the result of the comparison indicates that the approximation does not reach the preset first approximation requirement, it obtains, based on the second tracking information at the current moment, a first current relative position parameter of the target object with respect to the tracking device and judges whether the first current relative position parameter satisfies the first preset condition, determining that the visual tracking is invalid and the wireless signal tracking is valid when it does; and/or it obtains, based on the first tracking information at the current moment, a second current relative position parameter of the target object with respect to the tracking device and judges whether the second current relative position parameter satisfies the second preset condition, determining that the visual tracking is valid and the wireless signal tracking is invalid when it does.
  • In some embodiments, the determining unit 150 is configured to calculate the first distance between the first current relative position parameter and a first historical relative position parameter, wherein the first historical relative position parameter is a relative position parameter of the target object relative to the tracking device determined based on second tracking information detected at one or more historical moments before the current moment, or is the average of the relative position parameters of the target object relative to the tracking device determined based on second tracking information detected at at least two historical moments before the current moment; when the first distance is less than the first specific value, it is determined that the first current relative position parameter satisfies the first preset condition.
  • In some embodiments, the determining unit 150 is further configured to calculate a second distance between the second current relative position parameter and a second historical relative position parameter, wherein the second historical relative position parameter is a relative position parameter of the target object relative to the tracking device determined based on first tracking information detected at one or more historical moments before the current moment, or is the average of the relative position parameters of the target object relative to the tracking device determined based on first tracking information detected at at least two historical moments before the current moment; when the second distance is less than the second specific value, it is determined that the second current relative position parameter satisfies the second preset condition.
  • the determining unit 150 is configured to maintain a current acquisition direction of the tracking device or maintain a current signal detection direction of the tracking device when the signal detection direction is consistent with the acquisition direction.
  • the determining unit is further configured to: when the signal detection direction is inconsistent with the acquisition direction, obtain a relative position parameter of the target object relative to the tracking device based on the second tracking information, based on the relative position Determining the current acquisition direction according to the parameter and the signal detection direction, or obtaining a relative position parameter of the target object relative to the tracking device based on the first tracking information, based on the relative position parameter and the collecting Direction, determining the direction of signal detection.
  • The determining unit 150 determines, according to the current acquisition direction and the signal detection direction, whether the acquisition direction and/or the signal detection direction need to be adjusted, so that the failed tracking mode can be restored to effective tracking.
  • the determining unit 150 is configured to perform at least one of the following:
  • the first tracking information is the final tracking information when the visual tracking is valid and the wireless signal tracking is invalid;
  • the final tracking information is calculated by using the first tracking information and the second tracking information as the inputs of a preset functional relationship.
  • In some embodiments, the determining unit 150 is configured to, during the visual tracking failure and while tracking the target object based on the second tracking information, obtain first tracking information from visual tracking, calculate the degree of approximation between the first tracking information and the second tracking information, and compare the calculated approximation value with a preset second threshold; when the result of the comparison indicates that the approximation reaches the preset second approximation requirement, it is determined that the visual tracking has been restored to validity.
  • In some embodiments, the determining unit 150 is further configured to, when the visual tracking is valid and the wireless signal tracking is invalid, obtain second tracking information from wireless signal tracking while tracking the target object based on the first tracking information, calculate the degree of approximation between the first tracking information and the second tracking information, and compare the calculated approximation value with a preset third threshold; when the result of the comparison indicates that the approximation reaches the preset third approximation requirement, it is determined that the wireless signal tracking has been restored to validity.
  • In some embodiments, the determining unit 150 may be configured to, when the visual tracking fails and the wireless signal tracking is valid, determine the current acquisition direction based on the signal detection direction of the wireless signal tracking and the acquisition direction in which the visual tracking collects the tracking image, so that the visual tracking acquires a tracking image including the target object; or, when the visual tracking is valid and the wireless signal tracking fails, determine the signal detection direction of the wireless signal tracking based on the signal detection direction and the acquisition direction. The tracking is then restored by adjusting the acquisition direction or the signal detection direction, and whether the tracking recovery is valid is verified through the calculation of the approximation.
  • The determining unit 150 can also correspond to a processor or processing circuit; by adjusting the acquisition direction or the signal detection direction, the failed tracking mode can be restored.
  • In some embodiments, the determining unit 150 is configured to maintain the current acquisition direction of the tracking device when the signal detection direction is consistent with the acquisition direction, or to maintain the current signal detection direction of the tracking device in that case; and/or the determining unit is further configured to: when the signal detection direction is inconsistent with the acquisition direction, obtain a relative position parameter of the target object relative to the tracking device based on the second tracking information and determine the current acquisition direction based on the relative position parameter and the signal detection direction; or, when the signal detection direction is inconsistent with the acquisition direction, obtain a relative position parameter of the target object relative to the tracking device based on the first tracking information and determine the signal detection direction based on the relative position parameter and the acquisition direction.
  • the tracking device further includes:
  • a first verification unit configured to: when the visual tracking fails and the wireless signal tracking is valid, determine the current acquisition parameters for image acquisition according to the wireless signal detection direction and the image tracking direction of the visual tracking; extract, from the current tracking image acquired with the current acquisition parameters, candidate image regions based on the first image feature of the target object; extract candidate image features for the graphic objects in the candidate image regions; and calculate the degree of matching between each candidate image feature and the second image feature of the target object, wherein the second image feature includes the first image feature; when a candidate image feature whose matching degree is greater than the fourth threshold is detected, it is determined that the visual tracking recovery is valid.
  • In some embodiments, the tracking device further includes: a second verification unit configured to, when the visual tracking is valid and the wireless signal tracking fails, detect the second tracking information in a signal detection direction determined based on the first tracking information, and re-determine the relative position parameter of the target object with respect to the tracking device, wherein the relative position parameter comprises a relative distance and/or a relative angle; to judge whether the relative distance is less than a fifth threshold and/or whether the relative angle is less than a sixth threshold; and to determine that the wireless signal tracking recovery is valid when the relative distance is less than the fifth threshold and/or the relative angle is less than the sixth threshold.
  • the first verification unit and the second verification unit may correspond to the processor or the like, and according to the detected tracking information, verify whether the tracking of the corresponding tracking mode is valid or effective by the calculation of the first distance and/or the second distance.
  • the embodiment provides a tracking device, including:
  • the image acquisition module 210 is configured to collect a tracking image;
  • the antenna module 220 is configured to detect the wireless signal;
  • the memory 230 is configured to store a computer program;
  • the processor 240 is connected to the image acquisition module 210, the antenna module 220, and the memory 230, respectively, for performing the target tracking method provided by any one of the foregoing embodiments by executing the computer program.
  • the image acquisition module 210 can include one or more cameras.
  • the antenna module 220 can include one or more antennas that can receive wireless signals.
  • the memory 230 can include various storage media, such as non-transitory storage media, that can be used to store the computer program.
  • the processor 240 can be a central processing unit, a microprocessor, an application processor, a digital signal processor, or a programmable array, etc., and can be connected to the image acquisition module 210, the antenna module 220, and the memory 230 via the bus 250.
  • The bus can be an inter-integrated circuit (IIC) bus or the like.
  • the processor 240 can perform the target tracking method provided by any one of the foregoing technical solutions by executing the computer program, and specifically, the method shown in FIG. 1 and/or FIG.
  • the embodiment of the present application further provides a computer storage medium, where the computer storage medium stores a computer program; after the computer program is executed by the processor, the target tracking method provided by any one of the foregoing embodiments can be executed.
  • the computer storage medium herein may be a non-transitory storage medium, and may specifically be various types of storage media such as an optical disk, a flash memory, or a mobile hard disk.
Specific examples are provided below in combination with the above embodiments.

Example 1:

Tracking a target object and retrieving the tracked object refer to the following situation: while one target object is tracked over a long period, the tracker loses the set target object because of various disturbances, or is led away by an interfering object, so the initially set target object must be found again and normal tracking resumed.
For example, during visual tracking an interfering object is inserted between the tracked target object and the tracking device, so that the target's image disappears from the acquired tracking image. Or the target object enters an area of strong light; the light is too intense, and the tracking device has not yet had time to adjust acquisition parameters such as acquisition brightness and acquisition contrast to the current ambient brightness, so the acquired tracking image is overexposed and the target object cannot be successfully extracted from it.
Based on visual tracking, a model can be trained from the target object template defined in the initial frame and then used to track that target object in subsequent video; the model is continuously updated during tracking, so as to adapt to posture changes of the target object and to overcome interference from complex backgrounds. The target object template here may include one or more image features of the target object, or a feature extraction model of the imaged target object, and the like.
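One common way to realize such continuous model updating, given here only as a minimal sketch under the assumption that the template is a single feature vector, is an exponential moving average:

```python
import numpy as np

def update_template(template, new_feature, learning_rate=0.05):
    """Drift the stored target template toward the newest observation so the
    model follows posture and appearance changes; a small learning rate keeps
    the template from being captured by brief occlusions."""
    return (1.0 - learning_rate) * np.asarray(template) + learning_rate * np.asarray(new_feature)
```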
Since no offline training is required, this type of technique is highly versatile and can track any object specified by the user. However, a main problem of long-term tracking algorithms is that it is impossible, during tracking, to judge accurately whether the target object has been lost, and, after it is lost, to accurately retrieve the initially set target object. Moreover, target retrieval based purely on visual tracking is easily disturbed by similar objects and by the environment, so it is difficult to build a robust long-term tracking system.
In this example, UWB tracking is therefore introduced on top of visual tracking, to assist it and to reduce the chance that the target object is lost.
UWB is a carrierless communication technology that transmits data with nanosecond- to picosecond-scale non-sinusoidal narrow pulses. Its sub-nanosecond ultra-narrow pulses can be used for accurate short-range indoor positioning or target tracking, with strong anti-interference performance, no carrier, low device transmit power, and high tracking accuracy, down to about 10 cm.
A target tracking method combining UWB tracking and visual tracking is used to solve the problems of losing and retrieving the target object during tracking.
1) The target object to be followed needs to carry a UWB beacon; the UWB beacon is a transmitting device that sends the UWB signal. After tracking starts, the visual tracking supplies the visual tracking information used to control the robot to track the target object; the control signal contains, at each moment t, the target object distance $d_{cam}^{t}$ and the relative angle $\theta_{cam}^{t}$ between the target object and the robot.

2) During visual tracking, the distance $d_{uwb}^{t}$ and relative angle $\theta_{uwb}^{t}$ of the UWB beacon from the robot are likewise collected at each moment.

3) At each moment, the distances and angles given by UWB tracking and visual tracking are checked, and the result is handled accordingly. The specific steps are as follows: first compute the average distances and average angles obtained by UWB and by the visual algorithm over the past N moments, $\bar{d}_{uwb}$, $\bar{\theta}_{uwb}$, $\bar{d}_{cam}$ and $\bar{\theta}_{cam}$; then compute the consistency $S_{conf}$ of the two sets of distances and angles as

$S_{conf}=\sqrt{\bar{d}_{uwb}^{2}+\bar{d}_{cam}^{2}-2\,\bar{d}_{uwb}\,\bar{d}_{cam}\cos(\bar{\theta}_{uwb}-\bar{\theta}_{cam})}$,

i.e. the Euclidean distance between the two position estimates expressed in robot-centred polar coordinates.
The $S_{conf}$ computed by the above formula is in fact the distance between the spatial positions of the target object as calculated by UWB and by vision. A threshold, denoted here as $S_{thr}$, is set on this distance to judge whether the two sets of data are consistent: when $S_{conf}$ is greater than $S_{thr}$, the target object is considered to have been tracked wrongly. This consistency check can in fact also cover the case where one of the two sets of tracking data is invalid. For example, once the target object is lost in visual tracking, the distance and angle obtained will stay at 0, and the computed $S_{conf}$ will be very large. A consistency check designed in this way can therefore solve well the problem of accurately judging target loss in purely visual tracking. For a single visual tracking system, most systems are based on templates updated online; once the wrong target is tracked, the tracker is led away by the wrong target object. Simply adding more vision-based information can hardly solve such mistracking, and overly complex computation is difficult to apply in a real-time system. The position check proposed here uses additional sensor information and can judge this situation accurately while adding almost no computational complexity.
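A minimal sketch of this consistency check in Python; the window length N, the threshold value, and all variable names are illustrative, not taken from the patent:

```python
import math
from collections import deque

N = 10  # number of past moments to average over (illustrative value)
cam_hist = deque(maxlen=N)  # (distance, angle) pairs from visual tracking
uwb_hist = deque(maxlen=N)  # (distance, angle) pairs from UWB tracking

def s_conf():
    """Distance between the average positions estimated by vision and by UWB,
    both expressed in robot-centred polar coordinates (law of cosines)."""
    if not cam_hist or not uwb_hist:
        return float("inf")  # no data yet: treat as inconsistent
    d_cam = sum(d for d, _ in cam_hist) / len(cam_hist)
    a_cam = sum(a for _, a in cam_hist) / len(cam_hist)
    d_uwb = sum(d for d, _ in uwb_hist) / len(uwb_hist)
    a_uwb = sum(a for _, a in uwb_hist) / len(uwb_hist)
    return math.sqrt(d_cam ** 2 + d_uwb ** 2
                     - 2.0 * d_cam * d_uwb * math.cos(a_cam - a_uwb))

def trackers_consistent(s_thr=1.0):
    """The two trackers are judged to follow the same object while S_conf
    stays below the threshold s_thr (placeholder value)."""
    return s_conf() < s_thr
```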
When the tracked target object is considered to have deviated, the validity of the two sets of tracking data is checked first. If the UWB data is valid (consecutive frames show changes), UWB tracking is enabled, and $d_{uwb}^{t}$ and $\theta_{uwb}^{t}$ are used as the tracking information; this continues until the condition is met that the UWB tracking angle falls below an angle threshold, at which point visual verification of the followed target object is started.
The significance of setting a threshold on the angle is that, when the robot tracks the target object correctly and stably, its posture should directly face the tracked target object, and the angle obtained should then be close to 0; this angle threshold is usually set smaller than the angle threshold used in the validity judgment above. In this example, when re-verifying whether the visual tracking has recovered after it failed, verification is attempted only while the tracking angle from the currently valid UWB tracking is smaller than a specific value: a small angle between the target object and the tracking device means a high probability that the target object appears in the tracking image, so the number of recovery verifications can be reduced. A one-line gate of this kind is sketched below.
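Such a gate can be expressed in a single comparison; the threshold value below is a placeholder:

```python
def should_attempt_visual_reverify(theta_uwb, theta_gate=0.15):
    """Attempt the comparatively expensive visual re-identification only when
    the UWB bearing indicates the target is nearly straight ahead and thus
    likely inside the camera's field of view (gate value in radians,
    placeholder only)."""
    return abs(theta_uwb) < theta_gate
```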
After UWB tracks the target object stably, the visual tracking method is enabled for verification: a vision-based target object re-identification method verifies whether the tracked target object is consistent with the previously tracked one, and after consistency is confirmed, the normal visual-algorithm tracking state is restored.
The target tracking method combining visual tracking and UWB tracking is given below with reference to FIG. 7; it includes the following steps (a simplified sketch of this loop follows the step list):
Step S1: select a target object and confirm that the target object carries a UWB beacon;
Step S2: perform visual tracking and UWB tracking;
Step S3: obtain the tracking distances and tracking angles of the visual tracking and the UWB tracking and check their consistency;
Step S4: based on the result of the check, judge whether the same object is being tracked; if yes, return to step S2; if not, proceed to step S5;
Step S5: maintain UWB tracking;
Step S6: verify the visual tracking based on the tracking distance and tracking angle of the UWB tracking;
Step S7: judge whether the same object is being tracked; if yes, proceed to step S2; if not, return to step S5.
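The loop of steps S1 to S7 amounts to a two-state machine. A simplified sketch follows, with the outcomes of the checks supplied externally; it illustrates the structure of the loop, not the decision logic of any specific tracker:

```python
import enum

class Mode(enum.Enum):
    FUSED = 1     # steps S2-S4: visual + UWB tracking with consistency check
    UWB_ONLY = 2  # steps S5-S7: keep UWB tracking, try to re-verify vision

def next_mode(mode, consistent_now, visual_reverified):
    """One pass through the FIG. 7 loop.

    consistent_now    -- outcome of the distance/angle check (steps S3-S4)
    visual_reverified -- outcome of vision-based re-identification (S6-S7)
    """
    if mode is Mode.FUSED:
        # S4: same object -> keep fused tracking; otherwise fall back (S5).
        return Mode.FUSED if consistent_now else Mode.UWB_ONLY
    # S7: re-identification confirms the same object -> resume fused tracking.
    return Mode.FUSED if visual_reverified else Mode.UWB_ONLY
```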
Example 2:

Step 1: a ground robot tracks a pedestrian.
Step 2: the ground robot carries a UWB module, a depth camera, and an RGB camera;
Step 3: the tracked target object carries a UWB beacon;
Step 4: tracking is performed on the video frames collected by the red-green-blue (RGB) camera; the angle of the target object relative to the robot is computed from the position of the target object in the image, and the depth information obtained by the depth camera is used to compute the distance between the target object and the robot (a geometric sketch of this computation follows the step list);
Step 5: the visual tracking, using a visual tracking algorithm or tracking model, also covers the various states of the target-tracking process, such as target object loss and target object retrieval; when the target object is lost, the distance and angle given are both 0, and target object retrieval is used to verify the target object after a short period of UWB tracking.
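By way of illustration, the angle in step 4 can be derived from a pinhole-camera model and the distance from the aligned depth image; in the sketch below, the image width, field of view, and the choice of a median depth are assumptions, not values from the patent:

```python
import math

def relative_angle(u_pixel, image_width=640, hfov_deg=60.0):
    """Bearing of the target from its horizontal pixel position, assuming a
    pinhole RGB camera with the given horizontal field of view."""
    focal_px = (image_width / 2.0) / math.tan(math.radians(hfov_deg) / 2.0)
    return math.atan2(u_pixel - image_width / 2.0, focal_px)

def relative_distance(depth_map, box):
    """Distance from the aligned depth image: the median depth inside the
    target's bounding box (u0, v0, u1, v1) is one simple, robust choice."""
    u0, v0, u1, v1 = box
    values = sorted(depth_map[v][u]
                    for v in range(v0, v1) for u in range(u0, u1)
                    if depth_map[v][u] > 0)
    return values[len(values) // 2] if values else 0.0
```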
In this example, UWB tracking and visual tracking are combined to solve the problem that a purely visual tracking system cannot accurately judge whether the target object is lost; short-term tracking with the UWB signal supplements the visual tracking and solves the difficulty of retrieving the target object during long-term tracking.
In the several embodiments provided in this application, it should be understood that the disclosed device and method may be implemented in other manners. The device embodiments described above are merely illustrative. For example, the division into units is only a division by logical function; in actual implementation there may be other ways of division, for example multiple units or components may be combined or integrated into another system, or some features may be ignored or not executed.
In addition, the mutual coupling, direct coupling, or communication connection between the components shown or discussed may be indirect coupling or communication connection through some interfaces, devices, or units, and may be electrical, mechanical, or of other forms. The units described above as separate components may or may not be physically separate, and the components displayed as units may or may not be physical units; they may be located in one place or distributed over multiple network units. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution of this embodiment.
In addition, the functional units in the embodiments of the present application may all be integrated into one processing module, or each unit may separately serve as one unit, or two or more units may be integrated into one unit; the integrated unit may be implemented in the form of hardware, or in the form of hardware plus software functional units.

Those of ordinary skill in the art may understand that all or some of the steps implementing the above method embodiments may be completed by hardware related to program instructions; the foregoing program may be stored in a computer-readable storage medium and, when executed, performs the steps including those of the above method embodiments.

The above are only specific implementations of the present application, but the protection scope of the present application is not limited thereto; any variation or replacement that can readily occur to a person skilled in the art within the technical scope disclosed in the present application shall be covered by the protection scope of the present application. Therefore, the protection scope of the present application shall be subject to the protection scope of the claims.
Industrial Applicability

In the technical solutions provided by the embodiments of the present application, visual tracking and wireless signal tracking are used simultaneously when tracking the target object; two tracking modes can thus be used for accurate tracking, reducing the phenomenon of losing the target object, which has a positive industrial effect. At the same time, the technical solutions provided by the embodiments of the present application are simple to implement, can be widely applied in industry, and have great prospects for industrial application.

Abstract

Embodiments of the present application disclose a target tracking method, a target tracking device, and a computer storage medium. The target tracking method, applied to a target tracking device, includes: acquiring a tracking image; acquiring, based on the tracking image, first tracking information of visual tracking of a target object; detecting a wireless signal for the target tracking; parsing the wireless signal to acquire second tracking information of wireless signal tracking of the target object; and determining final tracking information of the target object by combining the first tracking information and the second tracking information.


Claims (19)

  1. A target tracking method, applied to a target tracking device, comprising:
    acquiring a tracking image;
    acquiring, based on the tracking image, first tracking information of visual tracking of a target object;
    detecting a wireless signal for the target tracking;
    parsing the wireless signal to acquire second tracking information of wireless signal tracking of the target object;
    determining final tracking information of the target object by combining the first tracking information and the second tracking information.
  2. The method according to claim 1, wherein
    the determining the final tracking information of the target object by combining the first tracking information and the second tracking information comprises:
    judging, by combining the first tracking information and the second tracking information, whether target loss has occurred in the visual tracking and the wireless signal tracking;
    determining the final tracking information according to a result of the judging.
  3. The method according to claim 2, wherein
    the judging, by combining the first tracking information and the second tracking information, whether target loss has occurred in the visual tracking and the wireless signal tracking comprises:
    calculating a degree of approximation between the first tracking information and the second tracking information;
    the determining the final tracking information according to the result of the judging comprises:
    comparing a value of the calculated degree of approximation with a preset first threshold, and determining that target loss has occurred in at least one of the visual tracking and the wireless signal tracking when a result of the comparison indicates that the degree of approximation does not meet a preset first approximation requirement.
  4. The method according to claim 3, wherein
    the calculating the degree of approximation between the first tracking information and the second tracking information comprises:
    calculating, based on the first tracking information, a first average distance and a first average angle of the target object relative to the target tracking device within a first preset time;
    calculating, based on the second tracking information, a second average distance and a second average angle of the target object relative to the target tracking device within the first preset time;
    calculating the degree of approximation based on the first average distance, the second average distance, the first average angle, and the second average angle.
  5. The method according to claim 3 or 4, wherein
    the comparing the value of the calculated degree of approximation with the preset first threshold, and determining that target loss has occurred in at least one of the visual tracking and the wireless signal tracking when the result of the comparison indicates that the degree of approximation does not meet the preset first approximation requirement, further comprises:
    directly determining that the visual tracking is invalid and the wireless signal tracking is valid;
    or,
    judging whether the second tracking information changes continuously within a second preset time; if a result of the judging is yes, determining that the wireless signal tracking is valid and the visual tracking is invalid; and if the result of the judging is no, determining that the visual tracking is valid and the wireless signal tracking is invalid.
  6. The method according to claim 2, wherein
    the method further comprises at least one of the following:
    when the result of the judging indicates that the visual tracking fails and the wireless signal tracking is valid, determining a current acquisition direction of the tracking image based on a signal detection direction of the wireless signal tracking and an acquisition direction in which the visual tracking acquires the tracking image, so that the visual tracking acquires a tracking image including the target object;
    when the result of the judging indicates that the visual tracking is valid and the wireless signal tracking fails, determining the signal detection direction of the wireless signal tracking based on the signal detection direction of the wireless signal tracking and the acquisition direction in which the visual tracking acquires the tracking image.
  7. The method according to claim 6, wherein
    when the signal detection direction is consistent with the acquisition direction, the determining the current acquisition direction comprises:
    maintaining the current acquisition direction of the target tracking device;
    when the signal detection direction is consistent with the acquisition direction, the determining the signal detection direction of the wireless signal tracking comprises:
    maintaining the current signal detection direction of the target tracking device;
    when the signal detection direction is not consistent with the acquisition direction, the determining the current acquisition direction comprises:
    obtaining, based on the second tracking information, a relative position parameter of the target object with respect to the target tracking device, and determining the current acquisition direction based on the relative position parameter and the signal detection direction;
    when the signal detection direction is not consistent with the acquisition direction, the determining the signal detection direction of the wireless signal tracking comprises:
    obtaining, based on the first tracking information, a relative position parameter of the target object with respect to the target tracking device, and determining the signal detection direction based on the relative position parameter and the acquisition direction.
  8. The method according to any one of claims 2 to 4 and 6 to 7, wherein
    the determining the final tracking information according to the result of the judging comprises at least one of the following:
    when the visual tracking fails and the wireless signal tracking is valid, determining the second tracking information as the final tracking information;
    when the visual tracking is valid and the wireless signal tracking is invalid, determining the first tracking information as the final tracking information;
    when both the visual tracking and the wireless signal tracking are valid, determining the first tracking information as the final tracking information;
    when both the visual tracking and the wireless signal tracking are valid, calculating the final tracking information with the first tracking information and the second tracking information as variables of a preset functional relationship.
  9. The method according to claim 8, wherein
    after the determining the second tracking information as the final tracking information when the visual tracking fails and the wireless signal tracking is valid, the method further comprises:
    in a process of tracking the target object based on the second tracking information, obtaining first tracking information obtained by the visual tracking, calculating the degree of approximation between the first tracking information and the second tracking information, comparing a value of the calculated degree of approximation with a preset second threshold, and determining that the visual tracking has recovered and is valid when a result of the comparison indicates that the degree of approximation meets a preset second approximation requirement.
  10. The method according to claim 8, wherein
    after the determining the first tracking information as the final tracking information when the visual tracking is valid and the wireless signal tracking is invalid, the method further comprises:
    in a process of tracking the target object based on the first tracking information, obtaining second tracking information obtained by the wireless signal tracking, calculating the degree of approximation between the first tracking information and the second tracking information, comparing a value of the calculated degree of approximation with a preset third threshold, and determining that the wireless signal tracking has recovered and is valid when a result of the comparison indicates that the degree of approximation meets a preset third approximation requirement.
  11. A target tracking device, comprising:
    an acquisition unit, configured to acquire a tracking image;
    a first acquiring unit, configured to acquire, based on the tracking image, first tracking information of visual tracking of a target object;
    a detection unit, configured to detect a wireless signal for the target tracking;
    a second acquiring unit, configured to parse the wireless signal to acquire second tracking information of wireless signal tracking of the target object;
    a determining unit, configured to determine final tracking information of the target object by combining the first tracking information and the second tracking information.
  12. The target tracking device according to claim 11, wherein
    the determining unit is configured to judge, by combining the first tracking information and the second tracking information, whether target loss has occurred in the visual tracking and the wireless signal tracking, and to determine the final tracking information according to a result of the judging.
  13. The target tracking device according to claim 12, wherein
    the determining unit is configured to calculate a degree of approximation between the first tracking information and the second tracking information, compare a value of the calculated degree of approximation with a preset first threshold, and determine that target loss has occurred in at least one of the visual tracking and the wireless signal tracking when a result of the comparison indicates that the degree of approximation does not meet a preset first approximation requirement.
  14. The target tracking device according to claim 13, wherein
    the determining unit is configured to calculate, based on the first tracking information, a first average distance and a first average angle of the target object relative to the tracking device within a first preset time; calculate, based on the second tracking information, a second average distance and a second average angle of the target object relative to the tracking device within the first preset time; and calculate the degree of approximation based on the first average distance, the second average distance, the first average angle, and the second average angle.
  15. The target tracking device according to claim 13 or 14, wherein
    the determining unit is configured to compare the value of the calculated degree of approximation with the preset first threshold and, when the result of the comparison indicates that the degree of approximation does not meet the preset first approximation requirement, directly determine that the visual tracking is invalid and the wireless signal tracking is valid; or judge whether the second tracking information changes continuously within a second preset time, determine that the wireless signal tracking is valid and the visual tracking is invalid if a result of the judging is yes, and determine that the visual tracking is valid and the wireless signal tracking is invalid if the result of the judging is no.
  16. The target tracking device according to claim 12, 13 or 14, wherein
    the determining unit is configured to perform at least one of the following:
    when the visual tracking fails and the wireless signal tracking is valid, determining the second tracking information as the final tracking information;
    when the visual tracking is valid and the wireless signal tracking is invalid, determining the first tracking information as the final tracking information;
    when both the visual tracking and the wireless signal tracking are valid, determining the first tracking information as the final tracking information;
    when both the visual tracking and the wireless signal tracking are valid, calculating the final tracking information with the first tracking information and the second tracking information as variables of a preset functional relationship.
  17. The target tracking device according to claim 16, wherein
    the determining unit is configured to: when the visual tracking fails, in a process of tracking the target object based on the second tracking information, obtain first tracking information obtained by the visual tracking, calculate the degree of approximation between the first tracking information and the second tracking information, compare a value of the calculated degree of approximation with a preset second threshold, and determine that the visual tracking has recovered and is valid when a result of the comparison indicates that the degree of approximation meets a preset second approximation requirement;
    when the visual tracking is valid and the wireless signal tracking is invalid, in a process of tracking the target object based on the first tracking information, obtain second tracking information obtained by the wireless signal tracking, calculate the degree of approximation between the first tracking information and the second tracking information, compare a value of the calculated degree of approximation with a preset third threshold, and determine that the wireless signal tracking has recovered and is valid when a result of the comparison indicates that the degree of approximation meets a preset approximation requirement.
  18. A target tracking device, comprising:
    an image acquisition module, configured to acquire a tracking image;
    an antenna module, configured to detect a wireless signal;
    a memory, configured to store a computer program;
    a processor, connected to the image acquisition module, the antenna module, and the memory, respectively, and configured to perform, by executing the computer program, the method according to any one of claims 1 to 10.
  19. A computer storage medium, storing a computer program, wherein after being executed by a processor, the computer program is capable of performing the method according to any one of claims 1 to 10.
PCT/CN2018/088020 2017-05-24 2018-05-23 Target tracking method, target tracking device, and computer storage medium WO2018214909A1 (zh)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201710374093.8 2017-05-24
CN201710374093.8A CN107255468B (zh) Target tracking method, target tracking device, and computer storage medium

Publications (1)

Publication Number Publication Date
WO2018214909A1 (zh)

Family

ID=60027992

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2018/088020 WO2018214909A1 (zh) Target tracking method, target tracking device, and computer storage medium

Country Status (2)

Country Link
CN (1) CN107255468B (zh)
WO (1) WO2018214909A1 (zh)



Also Published As

Publication number Publication date
CN107255468A (zh) 2017-10-17
CN107255468B (zh) 2019-11-19

