CN107255468A - Target tracking method, target tracking device, and computer storage medium - Google Patents

Target tracking method, target tracking device, and computer storage medium

Info

Publication number
CN107255468A
Authority
CN
China
Prior art keywords
tracking
tracking information
wireless signal
vision
approximation
Prior art date
Legal status
Granted
Application number
CN201710374093.8A
Other languages
Chinese (zh)
Other versions
CN107255468B (en)
Inventor
唐矗
Current Assignee
Ninebot Beijing Technology Co Ltd
Original Assignee
Ninebot Beijing Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by Ninebot Beijing Technology Co Ltd
Priority to CN201710374093.8A
Publication of CN107255468A
Priority to PCT/CN2018/088020 (WO2018214909A1)
Application granted
Publication of CN107255468B
Legal status: Active
Anticipated expiration


Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C 11/00 Photogrammetry or videogrammetry, e.g. stereogrammetry; Photographic surveying
    • G01C 11/02 Picture taking arrangements specially adapted for photogrammetry or photographic surveying, e.g. controlling overlapping of pictures
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S 13/00 Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
    • G01S 13/66 Radar-tracking systems; Analogous systems
    • G01S 13/72 Radar-tracking systems; Analogous systems for two-dimensional tracking, e.g. combination of angle and range tracking, track-while-scan radar
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 Image analysis
    • G06T 7/70 Determining position or orientation of objects or cameras

Abstract

Embodiments of the invention disclose a target tracking method, a target tracking device, and a computer storage medium. The target tracking method, applied to a target tracking device, includes: capturing a tracking image; obtaining, based on the tracking image, first tracking information from visual tracking of a target object; detecting a wireless signal used for the target tracking; parsing the wireless signal to obtain second tracking information from wireless-signal tracking of the target object; and determining final tracking information of the target object by combining the first tracking information and the second tracking information.

Description

Target tracking method, target tracking device, and computer storage medium
Technical Field
The present invention relates to the field of electronic technology, and in particular to a target tracking method, a target tracking device, and a computer storage medium.
Background
With the development of electronic technology, tracking devices use vision systems and image acquisition to follow a target and provide personal, follow-along services. In the prior art, however, there are usually many interfering factors: on the one hand, interference easily causes the target to be lost; on the other hand, once interference causes the tracked target to be lost, it is difficult to find the target again.
Summary of the Invention
In view of this, embodiments of the present invention are expected to provide a target tracking method, a target tracking device, and a computer storage medium, so as to solve the problems that the target object is easily lost and/or is difficult to recover once lost.
To this end, the technical solutions of the invention are realised as follows:
A first aspect of the embodiments of the invention provides a target tracking method, applied to a target tracking device, including:
capturing a tracking image;
obtaining, based on the tracking image, first tracking information from visual tracking of a target object;
detecting a wireless signal used for the target tracking;
parsing the wireless signal to obtain second tracking information from wireless-signal tracking of the target object;
determining final tracking information of the target object by combining the first tracking information and the second tracking information.
A second aspect of the embodiments of the invention provides a target tracking device, including:
an acquisition unit, configured to capture a tracking image;
a first obtaining unit, configured to obtain, based on the tracking image, first tracking information from visual tracking of a target object;
a detection unit, configured to detect a wireless signal used for the target tracking;
a second obtaining unit, configured to parse the wireless signal and obtain second tracking information from wireless-signal tracking of the target object;
a determining unit, configured to determine final tracking information of the target object by combining the first tracking information and the second tracking information.
A third aspect of the embodiments of the invention provides a target tracking device, including:
an image acquisition module, configured to capture a tracking image;
an antenna module, configured to detect a wireless signal;
a memory, storing a computer program;
a processor, connected to the image acquisition module, the antenna module, and the memory respectively, and configured to execute the computer program so as to perform the target tracking method provided by any one or more of the foregoing technical solutions.
A fourth aspect of the embodiments of the invention provides a computer storage medium storing a computer program; when the computer program is executed by a processor, the target tracking method provided by any one or more of the foregoing technical solutions can be performed.
With the target tracking method, target tracking device, and computer storage medium provided by the embodiments of the invention:
First, the target tracking device can track the same target object with two tracking modes at once, so that when one mode fails, effective tracking of the target object is maintained as long as the other remains valid. This solves the high loss rate caused by using a single tracking mode and improves the tracking success rate.
Second, when one mode fails while the other remains valid, the tracking information of the valid mode can be used to restore the failed mode, so that the target object is found again and the failed mode resumes tracking, which reduces the difficulty of recovering the target object.
Third, of the two tracking modes used simultaneously in this embodiment, one is visual tracking, which obtains tracking information by capturing and analysing tracking images, and the other is wireless-signal tracking, which obtains tracking information by detecting wireless signals. The two modes differ greatly and each has its own characteristics and suitable tracking scenes, so they complement one another; the probability that both fail at the same time is low, which further reduces the probability of losing the target object and again improves the tracking success rate.
Brief Description of the Drawings
Fig. 1 is a schematic flowchart of a first target tracking method provided by an embodiment of the invention;
Fig. 2 is a schematic flowchart of a second target tracking method provided by an embodiment of the invention;
Fig. 3 is a schematic diagram of the display effect of a tracking image provided by an embodiment of the invention;
Fig. 4 is a rendering of the candidate tracking regions contained in Fig. 3;
Fig. 5 is a schematic structural diagram of a tracking device provided by an embodiment of the invention;
Fig. 6 is a schematic structural diagram of another tracking device provided by an embodiment of the invention;
Fig. 7 is a schematic flowchart of a third target tracking method provided by an embodiment of the invention.
Detailed Description of the Embodiments
The technical solutions of the invention are further elaborated below with reference to the accompanying drawings and specific embodiments.
This embodiment provides a target tracking method applied to a target tracking device. As shown in Fig. 1, the method includes:
Step S110: capture a tracking image;
Step S120: obtain, based on the tracking image, first tracking information from visual tracking of a target object;
Step S130: detect a wireless signal used for the target tracking;
Step S140: parse the wireless signal to obtain second tracking information from wireless-signal tracking of the target object;
Step S150: determine final tracking information of the target object by combining the first tracking information and the second tracking information.
The target tracking method provided by this embodiment may be an information processing method applied to a target tracking device. The target tracking device may be a mobile robot that moves on the ground, or a flying robot (for example, a drone) that flies in the air.
In this embodiment, the target tracking device integrates an acquisition module and a wireless-signal detection module. The acquisition module can be used to capture images so as to realise visual tracking; the wireless-signal detection module can be used to perform wireless-signal tracking by transmitting and receiving wireless signals with the tracked target object, and includes at least a receiving antenna for the wireless signal and, in some embodiments, a transmitting antenna for sending the wireless signal. In this embodiment the wireless signal may be a UWB (Ultra Wideband) signal, i.e. a pulse signal with a predetermined time length, typically on the order of nanoseconds to picoseconds; the pulse may be a non-sinusoidal pulse.
In some embodiments, the wireless signal may be any of various radio-frequency signals.
In this embodiment, step S110 captures the tracking image, and in step S120 the first tracking information can be obtained by parsing the captured tracking image. The first tracking information here may include various items of relative-position information of the tracked target object with respect to the target tracking device, such as its distance and angle relative to the device.
When extracting the first tracking information from the captured tracking image in step S120, the tracking image may be parsed to determine parameters such as the imaging position and imaging area of the target object in the tracking image; based on the acquisition parameters (for example, the acquisition direction and focus position used by the tracking device when capturing the image), the distance and/or angle of the target object relative to the target tracking device can then be derived.
In some embodiments, the camera capturing the tracking image may consist of a depth camera and an RGB (red-green-blue) camera. The depth camera and the RGB camera capture images separately; when analysing the depth image from the depth camera and the RGB image from the RGB camera, the positional relationship between the two cameras is used to establish the mapping of the target object between the RGB image and the depth image, so that first tracking information such as the distance and/or angle of the target object relative to the tracking device can be located.
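As an illustrative sketch only (not taken from the patent), the conversion from a detection in the image to the first tracking information could look as follows, assuming a pinhole camera model, a depth image aligned with the RGB image, and integer pixel coordinates for the detected bounding box; all names are hypothetical:

```python
import math

def vision_tracking_info(bbox, depth_image, fx, cx):
    """Estimate target distance and bearing from a detected bounding box.

    bbox: (x_min, y_min, x_max, y_max), integer pixel bounds of the target in the RGB image.
    depth_image: 2D array-like of per-pixel depth in metres, aligned to the RGB image.
    fx, cx: horizontal focal length and principal point of the camera, in pixels.
    """
    x_min, y_min, x_max, y_max = bbox
    u_center = (x_min + x_max) / 2.0  # horizontal centre of the target in the image

    # Median depth inside the box gives a simple, outlier-tolerant distance estimate.
    depths = sorted(
        depth_image[v][u]
        for v in range(y_min, y_max)
        for u in range(x_min, x_max)
        if depth_image[v][u] > 0
    )
    distance = depths[len(depths) // 2] if depths else float("nan")

    # Bearing of the target relative to the camera's optical axis (pinhole model).
    angle = math.atan2(u_center - cx, fx)
    return distance, angle  # one possible form of the "first tracking information"
```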
The target object may be any tracked object, such as a person or an animal. In general the target object is a living being, but in some cases it may also be another mobile device, i.e. the tracking device is used to follow another movable device, realising tracking of equipment.
The wireless signal is detected in step S130. In this embodiment, step S130 can be implemented in several ways:
First: receiving a wireless signal sent by a transmitting device carried by the target object;
Second: the tracking device transmits a first signal and detects a second signal returned by reflection of the first signal; both the first signal and the second signal are wireless signals, and the target tracking device needs to detect the second signal reflected back after the first signal reaches the target object.
In short, whichever way is used, the wireless signal transmitted from the position of the target object is received in step S130. Then, according to the reception parameters of the wireless signal, for example the receiving direction and phase, the direction and distance of the target object relative to the tracking device can be located; or, according to the signal strength and/or received power of the wireless signal, combined with the transmit strength and/or transmit power known in advance, a transmission-loss model can be used to determine information such as the distance and/or angle of the target object relative to the tracking device, thereby obtaining the second tracking information.
In a specific implementation, two-way ranging (TWR) based on the wireless signal transmitted between the target object and the target tracking device may be used to calculate the relative distance between them. TWR is a bidirectional ranging method: two communicating units compute the signal time of flight from the time differences between mutually sending and receiving signals, from which the relative distance between the units can be calculated.
Alternatively, phase difference of arrival (PDOA) based on the wireless signal transmitted between the target object and the target tracking device may be used to calculate the relative distance and relative angle between them. PDOA is a positioning method based on phase differences: by measuring the phase difference with which a signal arrives at the monitoring station, the relative distance and angle between the signal source and the monitoring station can be determined.
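For illustration, simplified forms of these two ranging ideas are sketched below; the formulas are the textbook single-sided TWR and two-antenna PDOA relations rather than the patent's own implementation, and real UWB systems add clock-drift and calibration corrections:

```python
import math

SPEED_OF_LIGHT = 299_792_458.0  # metres per second

def twr_distance(t_round, t_reply):
    """Single-sided two-way ranging: the tracker sends a poll, the tag replies after t_reply.

    t_round: time from sending the poll to receiving the reply, measured by the tracker (s).
    t_reply: processing delay inside the tag before it replies (s).
    Clock drift between the two units is ignored in this sketch.
    """
    time_of_flight = (t_round - t_reply) / 2.0
    return time_of_flight * SPEED_OF_LIGHT

def pdoa_angle(phase_diff, wavelength, antenna_spacing):
    """Phase difference of arrival: bearing from the phase difference seen by two antennas."""
    sin_theta = phase_diff * wavelength / (2.0 * math.pi * antenna_spacing)
    return math.asin(max(-1.0, min(1.0, sin_theta)))  # clamp against measurement noise
```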
In this embodiment there is no fixed order between step S110 and step S130: step S110 may precede step S130, step S130 may precede step S110, or steps S110 and S130 may be performed simultaneously.
In this embodiment, steps S110 to S120 may be performed repeatedly at a first time interval, and steps S130 to S140 may be performed repeatedly at a second time interval. Any two adjacent first time intervals may be equal, so that the execution is periodic; likewise, any two second time intervals may be equal, so that this execution is also periodic.
Of course, the first time intervals and/or the second time intervals may also be unequal; the first time interval and the second time interval may be determined dynamically. For example, when the movement rate of the target object, determined from first tracking information obtained several times at historical moments, meets a first rate condition, for example is greater than a first predetermined rate threshold, the first time interval is shortened; if the first rate condition is not met, the first time interval may be increased. The first time interval is then clearly determined dynamically.
Similarly, when the movement rate of the target object, determined from second tracking information obtained several times at historical moments, meets a second rate condition, for example is greater than a second predetermined rate threshold, the second time interval is shortened; if the second rate condition is not met, the second time interval may be increased. The second time interval is then clearly determined dynamically.
The execution frequency of step S150 is usually determined by the lower of the acquisition frequencies of the first tracking information and the second tracking information.
Preferably, the execution interval of steps S110 to S120 is equal to the execution interval of steps S130 to S140, i.e. the first tracking information and the second tracking information are obtained synchronously.
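One way the dynamic interval adjustment described above could be realised is sketched below; the rate threshold, scaling factor, and interval bounds are illustrative assumptions, not values from the patent:

```python
def adapt_interval(current_interval, recent_speed, rate_threshold,
                   min_interval=0.05, max_interval=0.5, factor=0.8):
    """Shorten the acquisition interval when the target moves fast, relax it otherwise.

    recent_speed: movement rate of the target estimated from tracking information
                  obtained at several historical moments (m/s).
    rate_threshold: the predetermined rate threshold of the first or second rate condition.
    """
    if recent_speed > rate_threshold:      # rate condition met: sample more often
        new_interval = current_interval * factor
    else:                                  # rate condition not met: sample less often
        new_interval = current_interval / factor
    return max(min_interval, min(max_interval, new_interval))
```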
In this embodiment, two tracking modes are used when tracking the target object: one is visual tracking, which tracks the target through image capture; the other tracks the target object based on wireless-signal detection. The final tracking information of the target object is determined from the two kinds of tracking information obtained by these modes (i.e. the first tracking information and the second tracking information).
First, therefore, when one tracking mode fails (the target object is lost), the other can still provide tracking information. For the target tracking device, the probability that both modes fail simultaneously is lower than the probability that a single mode fails, so the probability of tracking failure is reduced and the tracking success rate is improved.
Second, the two tracking modes used in this embodiment, visual tracking and wireless-signal tracking, differ greatly and suit different tracking scenes. Where visual tracking fails easily, wireless-signal tracking may still track strongly, and where wireless-signal tracking fails easily, visual tracking may track strongly. This further reduces the chance of both modes failing at once and again improves the target tracking success rate.
In some embodiments, each of the two tracking modes can use its own tracking algorithm to determine whether tracking has become invalid at the current moment.
For example, in visual tracking, whether the imaging of the target object is successfully located in the currently captured tracking image determines whether the visual tracking is valid.
For example, in wireless-signal tracking, whether the wireless signal is detected within the current detection period determines whether the wireless-signal tracking is valid.
As another example, the distance and/or angle of the target object relative to the tracking device is determined based on the first tracking information; when the distance is less than a specified distance value and/or the angle is less than a specified angle value, the visual tracking is considered valid, otherwise the visual tracking has failed.
As another example, the distance and/or angle of the target object relative to the tracking device is determined based on the second tracking information; when the distance is less than a specified distance value and/or the angle is less than a specified angle value, the wireless-signal tracking is considered valid, otherwise the wireless-signal tracking has failed.
The above judges the validity of the two tracking modes separately and then determines the final tracking information of the tracked object according to whether each is valid. To make the validity judgement easier in practice, this embodiment provides a preferred implementation. As shown in Fig. 2, step S150 may include:
Step S151: judging, by combining the first tracking information and the second tracking information, whether target loss has occurred in the visual tracking and the wireless-signal tracking;
Step S152: determining the final tracking information according to the result of the judgement.
In this embodiment, whether the visual tracking and the wireless-signal tracking are valid is judged by combining the first tracking information and the second tracking information, so as to improve the accuracy of the judgement; the final tracking information is then determined according to the judgement result. The final tracking information in this embodiment may likewise include the distance and/or angle of the target object relative to the tracking device.
Specifically, step S151 includes: calculating the degree of approximation between the first tracking information and the second tracking information.
Step S152 may specifically include: comparing the calculated value of the degree of approximation with a preset first threshold, and when the comparison result indicates that the degree of approximation does not reach a preset first approximation requirement, determining that target loss has occurred in at least one of the visual tracking and the wireless-signal tracking.
In this embodiment, judging whether the degree of approximation fails to reach the first approximation requirement may include: the degree of approximation indicates the position difference between the two pieces of tracking information; if the position difference is below a specific threshold, the first approximation requirement is considered reached, otherwise it is considered not reached. The specific threshold may be one form of the first threshold.
There are several ways of calculating the degree of approximation between the first tracking information and the second tracking information in step S151, and of making the judgement based on it; several optional ways are given below.
Mode one:
If both the first tracking information and the second tracking information include the distance and angle of the target object relative to the target tracking device as detected by the respective tracking mode, calculating the degree of approximation may include computing the distance difference and the angle difference. Step S152 may then include: if the distance difference is not less than a predetermined distance difference and/or the angle difference is not less than a predetermined angle difference, it can be considered that target loss has occurred in at least one of the two tracking modes, i.e. tracking has failed.
Mode two: step S151 may include:
if both the first tracking information and the second tracking information include the distance and angle of the target object relative to the target tracking device as detected by the respective tracking mode, constructing a coordinate system with the target tracking device as the origin; determining a first coordinate value in the coordinate system based on the first tracking information; determining a second coordinate value in the coordinate system according to the second tracking information; constructing a difference vector from the first coordinate value and the second coordinate value; and calculating the modulus of the difference vector.
Step S152 may then include: when the modulus is not less than a predetermined modulus threshold, considering that target loss has occurred in at least one of the visual tracking and the wireless-signal tracking.
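A minimal sketch of this second mode, assuming each piece of tracking information is a (distance, angle) pair expressed in a coordinate system with the tracking device at the origin (names are illustrative):

```python
import math

def target_lost_by_difference_vector(d1, a1, d2, a2, modulus_threshold):
    """Return True if at least one tracking mode is considered to have lost the target.

    (d1, a1): distance (m) and angle (rad) from visual tracking (first tracking information).
    (d2, a2): distance (m) and angle (rad) from wireless-signal tracking (second tracking information).
    """
    x1, y1 = d1 * math.cos(a1), d1 * math.sin(a1)  # first coordinate value
    x2, y2 = d2 * math.cos(a2), d2 * math.sin(a2)  # second coordinate value
    modulus = math.hypot(x1 - x2, y1 - y2)         # modulus of the difference vector
    return modulus >= modulus_threshold
```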
To ensure the accuracy of the judgement, this embodiment provides a preferred implementation in which multiple pieces of first tracking information and multiple pieces of second tracking information are collected to judge whether tracking failure has occurred. Specifically, step S151 may include:
calculating, based on the first tracking information, a first average distance and a first average angle of the target object relative to the target tracking device within a first preset time, the first tracking information being obtained at least twice within the first preset time;
calculating, based on the second tracking information, a second average distance and a second average angle of the target object relative to the target tracking device within the first preset time, the second tracking information being obtained at least twice within the first preset time;
calculating the degree of approximation based on the first average distance, the second average distance, the first average angle, and the second average angle.
Thus, based on the first tracking information, the first average distance and first average angle of the target object relative to the target tracking device within the first preset time can be calculated, and based on the second tracking information, the second average distance and second average angle within the first preset time can be calculated.
In this embodiment the two average distances and the two average angles are combined to calculate the degree of approximation. For example, denoting the first average distance by d1, the first average angle by a1, the second average distance by d2, and the second average angle by a2, the degree of approximation Sconf is calculated as a function of these four quantities.
In some embodiments, whether at least one target loss has occurred may also be determined directly from the difference between the first average distance and the second average distance combined with the difference between the first average angle and the second average angle.
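Since the exact formula for Sconf is not reproduced here, the sketch below shows only one plausible way of combining the averaged distances and angles into a single score in (0, 1]; the functional form and the normalisation constants are assumptions, not the patent's formula:

```python
def approximation_score(mean_d1, mean_a1, mean_d2, mean_a2,
                        d_scale=1.0, a_scale=0.5):
    """Combine averaged distance/angle differences into one approximation score Sconf.

    mean_d1, mean_a1: first average distance / first average angle (visual tracking).
    mean_d2, mean_a2: second average distance / second average angle (wireless-signal tracking).
    d_scale, a_scale: normalisation constants (assumed values, not from the patent).
    """
    distance_term = abs(mean_d1 - mean_d2) / d_scale
    angle_term = abs(mean_a1 - mean_a2) / a_scale
    return 1.0 / (1.0 + distance_term + angle_term)  # closer results give a score nearer 1
```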
To reduce the amount of computation when further determining whether the visual tracking or the wireless-signal tracking has failed, it may first be judged from one piece of tracking information whether the corresponding tracking mode is valid; if it is, the other tracking mode is the one that has failed.
First way: step S152 further includes:
comparing the calculated value of the degree of approximation with the preset first threshold, and when the comparison result indicates that the degree of approximation does not reach the preset first approximation requirement, directly determining that the visual tracking is invalid and the wireless-signal tracking is valid.
Second way: step S152 further includes:
comparing the calculated value of the degree of approximation with the preset first threshold; when the comparison result indicates that the degree of approximation does not reach the preset first approximation requirement, judging whether the second tracking information changes continuously within a second preset time; if the judgement is yes, determining that the wireless-signal tracking is valid and the visual tracking is invalid; if the judgement is no, determining that the visual tracking is valid and the wireless-signal tracking is invalid.
In this embodiment, judging whether the second tracking information changes continuously is based on the continuity of the target object's motion, and may include:
the second tracking information is considered to change continuously if the target tracking device obtains the second tracking information multiple times within the second preset time and the repeatedly obtained second tracking information keeps changing. This maintains continuous tracking between the target tracking device and the target object and indicates that, throughout the acquisition moments of these pieces of tracking information, the target object remains within the tracking range of the device, so the device's current tracking of the target object is a continuing, valid tracking.
In other embodiments, to further improve the accuracy of the validity judgement, judging whether the second tracking information changes continuously may also include:
obtaining, according to the second tracking information at the current moment, a first current relative-position parameter of the target object relative to the tracking device, and determining whether the first current relative-position parameter meets a first preset condition; when it does, determining that the visual tracking has failed and the wireless-signal tracking is valid. The first preset condition here may include: during continuous tracking the tracking device keeps moving at a speed within its predetermined speed range, and the first current relative-position parameter shows that the distance between the target object and the tracking device always stays within a certain range, without suddenly becoming very large or dropping to zero.
In this embodiment, the degree of approximation characterises the similarity of the two pieces of tracking information.
Third way: step S152 may further include:
when the value of the degree of approximation is less than the preset first threshold, obtaining, according to the second tracking information at the current moment, a second current relative-position parameter of the target object relative to the tracking device, and determining whether the second current relative-position parameter meets a second preset condition; when it does, determining that the visual tracking is valid and the wireless-signal tracking has failed.
In a specific implementation, the second way and the third way are used in combination. For example, under the second way, when the first current position is judged not to meet the first preset condition, the third way is further used: if the first current relative position does not meet the first preset condition and the second current relative position meets the second preset condition, it is determined that the visual tracking is valid and the wireless-signal tracking has failed.
In some embodiments, the combined use of the second way and the third way may further include:
when the second current relative position does not meet the second preset condition and the first current relative position meets the first preset condition, determining that the visual tracking is invalid and the wireless-signal tracking is valid.
In other embodiments, when the combined use of the two ways produces conflicting judgements, the final judgement may be determined according to the confidence of the visual tracking and the wireless-signal tracking. For example, if the confidence of the visual tracking is higher than that of the wireless signal, the judgement that the visual tracking is valid and the wireless-signal tracking is invalid is confirmed as the final judgement; if the confidence of the visual tracking is not higher than that of the wireless signal, the judgement that the visual tracking is invalid and the wireless-signal tracking is valid is confirmed as the final judgement.
The confidence of the two tracking modes here can be received from another device or determined by the tracking device itself from statistics of past tracking failures. For example, if historical statistics show that visual tracking fails more frequently than wireless-signal tracking, the confidence of the visual tracking is determined to be lower than that of the wireless-signal tracking; otherwise the confidence of the wireless-signal tracking is determined to be lower than that of the visual tracking.
In this embodiment, the confidence characterises how trustworthy the tracking parameters currently provided by the corresponding tracking mode are.
Optionally, determining whether the first relative-position parameter meets the first preset condition includes:
calculating a first distance between the first current relative-position parameter and a first historical relative-position parameter, where the first historical relative-position parameter is the relative-position parameter of the target object relative to the tracking device determined from second tracking information detected at one or more historical moments before the current moment, or is the average of the relative-position parameters of the target object relative to the tracking device determined from second tracking information detected at at least two historical moments before the current moment;
when the first distance is less than a first specific value, determining that the first current relative-position parameter meets the first preset condition.
The motion of the target object has a certain continuity, and its movement rate generally does not change abruptly. This embodiment exploits this principle: the first distance computed from the currently calculated first current position parameter (which includes at least the current distance and, in some implementations, the current angle) is compared with the first specific value; if it is less than the first specific value, the first current position parameter is considered to meet the first preset condition, otherwise it is not.
For example, let the current moment be t0, historical moment 1 be t1, and historical moment 2 be t2. The first distances between the t0 moment and t1 and t2 are calculated respectively, giving two first distances, and each is compared with the specific value; when all the first distances are less than the specific value, it can be determined that the first current position parameter meets the first preset condition. In some embodiments, the first distances between the second tracking information detected at the current moment and the relative-position information corresponding to the second tracking information detected at one or more historical moments are calculated, yielding multiple first distances; when the proportion of these first distances that are smaller than the specific value meets a preset ratio requirement, the first current position parameter is determined to meet the first preset condition, which is equivalent to determining that the wireless-signal tracking is valid.
In some embodiments, the relative-position parameters of multiple historical moments relative to the tracking device may also be determined from multiple pieces of second tracking information, the average of these relative-position parameters is calculated, and the distance between the first current position parameter and this average is taken as the first distance; if the first distance is less than the first specific value, the first current position parameter is considered to meet the first preset condition.
The first specific value here may be a preset value, determined for example from the initial distance at which the target tracking device starts tracking the target object. During tracking, the target tracking device can dynamically adjust its own movement rate to stay consistent with the movement rate of the target object, so that the target object always remains within its tracking range. In that case, the first specific value may be c times the initial distance, where c is a positive number less than 1, for example 0.1 or 0.2.
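A sketch of this continuity test, assuming each relative-position parameter is a (distance, angle) pair compared as points in the tracker-centred plane and that the first specific value is c times the initial tracking distance; the acceptance ratio is an illustrative assumption:

```python
import math

def to_point(distance, angle):
    """Convert a (distance, angle) relative-position parameter to a Cartesian point."""
    return (distance * math.cos(angle), distance * math.sin(angle))

def first_condition_met(current_pos, history_positions, initial_distance,
                        c=0.2, min_ratio=0.5):
    """Check whether the first current relative-position parameter meets the first preset condition.

    current_pos: (distance, angle) from the second tracking information at the current moment.
    history_positions: non-empty list of (distance, angle) pairs from historical moments.
    The first specific value is taken here as c * initial_distance.
    """
    threshold = c * initial_distance
    cx, cy = to_point(*current_pos)
    close = 0
    for pos in history_positions:
        hx, hy = to_point(*pos)
        if math.hypot(cx - hx, cy - hy) < threshold:  # the "first distance" is small enough
            close += 1
    # Condition met (wireless-signal tracking valid) if enough first distances are small.
    return close / len(history_positions) >= min_ratio
```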
Determining whether the second current relative-position parameter meets the second preset condition includes:
calculating a second distance between the second current relative-position parameter and a second historical relative-position parameter, where the second historical relative-position parameter is the relative-position parameter of the target object relative to the tracking device determined from the first tracking information detected at the previous historical moment before the current moment, or is the average of the relative-position parameters of the target object relative to the tracking device determined from first tracking information detected at at least two historical moments before the current moment;
when the second distance is less than a second specific value, determining that the second current relative-position parameter meets the second preset condition.
In this embodiment, whether the wireless-signal tracking is valid can likewise be judged from the continuity of the target object's motion. The determination of the second specific value and of the second historical relative parameter is therefore similar to that of the first historical relative parameter and is not repeated here.
In some embodiments, to simplify the judgement, the first specific value and the second specific value may be identical. In some cases, for a more accurate judgement, the first specific value may be determined according to the accuracy of the visual tracking and the second specific value according to the accuracy of the wireless signal, in which case the second specific value and the first specific value may differ.
In embodiments of the invention, the judgement of the first preset condition and/or the second preset condition determines whether the corresponding tracking mode is valid. In some cases, a classifier that determines whether the different tracking modes of the target tracking device are valid may also be trained in advance; the classifier may be a neural network, a vector learning machine, or the like. When judging whether a tracking mode is valid, the tracking information detected at the current moment and at several historical moments adjacent to the current moment can be input into the classifier, which then outputs the corresponding judgement, reducing the computation needed to judge whether the corresponding tracking mode is valid.
When training the classifier, tracking information from valid tracking can be used as positive samples and tracking information from invalid tracking as negative samples; the validated classifier is then used by the tracking device to judge validity or failure during subsequent tracking.
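A minimal sketch of such a classifier; the use of scikit-learn, the MLP architecture, and the feature layout (a short window of distance/angle readings) are assumptions for illustration, since the patent only requires a classifier such as a neural network or a vector learning machine trained on valid and invalid tracking samples:

```python
from sklearn.neural_network import MLPClassifier

# Each sample: tracking information from the current moment and a few adjacent
# historical moments, flattened into one feature vector, e.g.
# [d_t, a_t, d_(t-1), a_(t-1), d_(t-2), a_(t-2)].
X_train = [
    [1.9, 0.05, 2.0, 0.04, 2.1, 0.06],  # smooth motion: tracking valid (positive sample)
    [2.0, 0.05, 6.5, 1.10, 0.1, -0.9],  # erratic jumps: tracking lost (negative sample)
]
y_train = [1, 0]

clf = MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000, random_state=0)
clf.fit(X_train, y_train)

# At run time, the latest window of tracking information yields a validity verdict.
print(clf.predict([[2.05, 0.03, 2.0, 0.05, 1.95, 0.04]]))  # e.g. [1]: tracking valid
```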
When one of the tracking modes fails, i.e. the tracked target object is lost, the failed tracking mode needs to resume tracking. In this embodiment the method therefore further includes the following steps.
When one tracking mode currently fails, the acquisition direction of image capture for the invalid tracking mode, or the signal-detection direction for wireless-signal detection, needs to be adjusted according to the still-valid tracking mode, so that tracking becomes valid again and subsequent tracking accuracy is ensured. Specifically, the method may include one or more of the following scenes.
Scene 1: the method includes:
when the visual tracking has failed and the wireless-signal tracking is valid, determining the current acquisition direction of the tracking image based on the signal-detection direction of the wireless-signal tracking and the acquisition direction in which the visual tracking captures the tracking image, so that the visual tracking captures a tracking image that includes the target object.
Scene 2: the method further includes:
when the visual tracking is valid and the wireless-signal tracking has failed, determining the signal-detection direction of the wireless-signal tracking based on the signal-detection direction of the wireless-signal tracking and the acquisition direction in which the visual tracking captures the tracking image.
The acquisition direction may be the direction of the camera that captures the tracking image; when the direction of the camera moves, the image content of the tracking image captured by the visual tracking will change.
The wireless signal detected in step S130 is transmitted from the position of the target object; if the transmission direction is behind the antenna that detects the wireless signal, the wireless signal may be missed. Therefore, in this embodiment, whether the acquisition direction of the visual tracking needs to be adjusted, and a first adjustment parameter, can be determined from the acquisition direction and the signal-detection direction. The first adjustment parameter here may include a first adjustment amount and a first adjustment direction; for example, the adjustment parameter may include an adjustment amount of 5 degrees together with an adjustment direction, i.e. adjusting 5 degrees to the left or 5 degrees to the right.
Likewise, if the wireless-signal tracking fails, whether the signal-detection direction needs to be adjusted, and a second adjustment parameter, are determined from the current acquisition direction and the wireless-signal detection direction; the second adjustment parameter may similarly include a second adjustment amount and a second adjustment direction.
In some cases, to simplify the tracking device's control of the visual tracking and the wireless-signal tracking, the acquisition direction of the visual tracking and the signal-detection direction of the wireless-signal tracking are kept consistent from the start, so that when one of them later fails, the calculation of adjustment parameters and the direction adjustment of the acquisition direction and signal-detection direction are reduced. For example, the camera and the wireless-signal detection antenna are integrated in one structure in the tracking device, so that the acquisition direction of the camera and the signal-detection direction of the wireless signal are always consistent.
Therefore, in some embodiments, when the signal-detection direction is consistent with the acquisition direction, determining the current acquisition direction includes maintaining the current acquisition direction of the tracking device; when the signal-detection direction is consistent with the acquisition direction, determining the signal-detection direction of the wireless-signal tracking includes maintaining the current signal-detection direction of the tracking device.
In some embodiments, the acquisition direction of the camera and the signal-detection direction may not be consistent, in which case the adjustment parameter must be determined jointly from the acquisition direction and the signal-detection direction, taking the acquisition direction or signal-detection direction corresponding to the currently valid tracking mode as the adjustment target. Specifically, when the signal-detection direction and the acquisition direction are inconsistent, determining the current acquisition direction includes:
obtaining the relative-position parameter of the target object relative to the tracking device based on the second tracking information, and determining the current acquisition direction based on the relative-position parameter and the signal-detection direction;
when the signal-detection direction and the acquisition direction are inconsistent, determining the signal-detection direction of the wireless-signal tracking includes: obtaining the relative-position parameter of the target object relative to the tracking device based on the first tracking information, and determining the signal-detection direction based on the relative-position parameter and the acquisition direction.
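As an illustrative sketch (the sign convention and the step size are assumptions), the adjustment amount and adjustment direction for the failed modality's pointing could be derived from the valid modality's relative-position parameter as follows:

```python
def direction_adjustment(target_bearing_deg, current_direction_deg, max_step_deg=5.0):
    """Derive the adjustment parameter for the failed modality's pointing direction.

    target_bearing_deg: bearing of the target (degrees) given by the relative-position
                        parameter of the still-valid tracking mode.
    current_direction_deg: current acquisition direction (visual tracking) or
                           signal-detection direction (wireless-signal tracking), in degrees.
    Returns (adjustment_amount, adjustment_direction).
    """
    error = target_bearing_deg - current_direction_deg
    amount = min(abs(error), max_step_deg)        # first/second adjustment amount
    direction = "left" if error > 0 else "right"  # first/second adjustment direction (assumed convention)
    return amount, direction
```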
After the acquisition direction or the signal-detection direction has been adjusted, the adjusted tracking needs to be verified to confirm whether it has become valid again.
Optionally, step S150 may include at least one of the following:
when the judgement result shows that the visual tracking has failed and the wireless-signal tracking is valid, determining the second tracking information as the final tracking information;
when the judgement result shows that the visual tracking is valid and the wireless-signal tracking is invalid, determining the first tracking information as the final tracking information;
when the judgement result shows that both the visual tracking and the wireless-signal tracking are valid, determining the first tracking information as the final tracking information;
when the judgement result shows that both the visual tracking and the wireless-signal tracking are valid, calculating the final tracking information with the first tracking information and the second tracking information as inputs to a preset functional relation.
In short, in step S150, if one tracking mode has failed and the other is valid, the tracking information obtained by the valid mode is used as the final tracking information.
When both tracking modes are valid in step S150, the tracking information of the two modes can be combined, for example by averaging, to calculate the final tracking information.
In other embodiments, if both tracking modes are valid, the tracking information of the mode with the higher confidence can be used directly as the final tracking information in order to reduce computation. For example, when the wireless-signal tracking is UWB-signal tracking and both the visual tracking and the UWB tracking are valid, the tracking parameters of the UWB tracking can be used directly as the final tracking information. In some embodiments, the priority of the two tracking modes can be preset; for example, between UWB tracking and visual tracking, the priority of visual tracking can be set higher than that of UWB tracking, and when both modes are valid, the tracking information of the higher-priority mode is selected directly as the final tracking information.
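The decision logic of step S150 described above can be summarised by the following sketch; the tuple representation of the tracking information and the averaging used as the preset functional relation are illustrative assumptions:

```python
def final_tracking_info(first_info, second_info, vision_valid, wireless_valid,
                        prefer=None):
    """Combine the two tracking results into the final tracking information.

    first_info / second_info: (distance, angle) from visual / wireless-signal tracking, or None.
    prefer: optional priority, "vision" or "wireless", used when both modes are valid.
    """
    if vision_valid and not wireless_valid:
        return first_info                 # only visual tracking is reliable
    if wireless_valid and not vision_valid:
        return second_info                # only wireless-signal tracking is reliable
    if vision_valid and wireless_valid:
        if prefer == "vision":
            return first_info             # higher-priority mode wins directly
        if prefer == "wireless":
            return second_info
        # Otherwise fuse the two results, here by simple averaging.
        return tuple((a + b) / 2.0 for a, b in zip(first_info, second_info))
    return None                           # both modes lost the target
```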
When one of the tracking modes fails, i.e. the tracked target object is lost, the failed tracking mode needs to resume tracking.
When the judgement result shows that the visual tracking has failed and the wireless-signal tracking is valid, after the second tracking information is determined as the final tracking information, the method further includes:
on the one hand, in the process of tracking the target object based on the second tracking information, obtaining the first tracking information from the visual tracking, calculating the degree of approximation between the first tracking information and the second tracking information, and comparing the calculated value with a preset second threshold; when the comparison result indicates that the degree of approximation reaches a preset second approximation requirement, determining that the visual tracking has recovered and is valid again.
On the other hand, when the judgement result shows that the visual tracking is valid and the wireless-signal tracking is invalid, after the first tracking information is determined as the final tracking information, the method further includes:
in the process of tracking the target object based on the first tracking information, obtaining the second tracking information from the wireless-signal tracking, calculating the degree of approximation between the first tracking information and the second tracking information, and comparing the calculated value with a preset third threshold; when the comparison result indicates that the degree of approximation reaches a preset third approximation requirement, determining that the wireless-signal tracking has recovered and is valid again.
In this embodiment, the values of the first threshold, the second threshold, and the third threshold may be the same or different. The degree of approximation calculated here from the first tracking information and the second tracking information may be computed in the same way as when judging whether at least one target loss has occurred.
In some cases, to reduce the number of times the degree of approximation is calculated and thus reduce computation, the calculation of the degree of approximation described above and its comparison with the second or third threshold may be triggered only when the relative-position parameter of the target object relative to the tracking device, determined from the first tracking information or the second tracking information, meets a specific condition, thereby reducing the computation and the number of recovery verifications.
In some embodiments, judging whether the visual tracking has recovered may also include:
when the visual tracking has failed and the wireless-signal tracking is valid, determining the current acquisition parameters of the image acquisition according to the wireless-signal detection direction and the image-acquisition direction of the visual tracking;
parsing the current tracking image captured with the current acquisition parameters, and extracting candidate image regions from the current tracking image based on a first image feature of the target object;
extracting candidate image features of the graphic objects in the candidate image regions;
calculating the matching degree between each candidate image feature and a second image feature of the target object, where the second image feature includes the first image feature;
when a candidate image feature whose matching degree exceeds a preset matching-degree threshold is detected, determining that the visual tracking has recovered.
The first image feature may include the outline feature of the tracked target object; for example, if the tracked target is a person, the first image feature may be the outer-edge feature of a human figure. The second image feature may include the first image feature, but it also contains more detailed features that can distinguish the target object from multiple objects of the same type, for example the facial contour features of the tracked user, the user's distinctive skin-colour features, or the clothing and wearable features of the tracked object.
In short, the first image feature and the second image feature may include contour features, texture features, colour features, size features, and the like.
In this embodiment, after the acquisition direction of the visual tracking is adjusted, images can be captured again; the captured tracking image is parsed, and the first image feature is used to mark out the candidate image regions that need to be identified further one by one. For example, if the target object is a tracked person, the tracking image may contain several human figures as well as, say, the figures of a trolley and a counter; in that case, based on the outline feature of a person, the regions showing human figures in the tracking image are segmented out as candidate image regions, image features are extracted from each candidate image region, the feature values are matched against the second image feature, and the matching degree is determined. The matching degree here may include, for example, the proportion of all the second image features to be matched that are matched successfully.
If the matching degree exceeds the matching-degree threshold, the target object is considered to have been detected in the current tracking image, and the current tracking can be considered to have recovered.
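A sketch of this re-identification step, assuming the candidate regions have already been segmented using the first (outline) image feature and that features are represented as simple name/value dictionaries compared within a tolerance; the representation and thresholds are illustrative assumptions:

```python
def matching_degree(candidate_features, target_second_features, tolerance=0.1):
    """Proportion of the target's second image features matched by one candidate region.

    candidate_features / target_second_features: dicts mapping feature names to values,
    e.g. colour-histogram bins, texture statistics, or contour-size ratios.
    """
    matched = sum(
        1
        for name, ref_value in target_second_features.items()
        if name in candidate_features
        and abs(candidate_features[name] - ref_value) <= tolerance
    )
    return matched / len(target_second_features)

def vision_recovered(candidate_regions, target_second_features, degree_threshold=0.8):
    """Visual tracking is considered recovered if any candidate region matches well enough."""
    return any(
        matching_degree(features, target_second_features) > degree_threshold
        for features in candidate_regions
    )
```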
Fig. 3 shows a tracking image, and Fig. 4 is a schematic diagram in which the candidate image regions of the tracking image in Fig. 3 are marked out; the regions enclosed by dashed rectangles in Fig. 4 are the candidate image regions.
In certain embodiments, methods described also includes:
When vision tracking is effective and during wireless signal tracking failure, using true based on first tracking information Second tracking information of the signal detection angle detecting made, redefines the destination object relative relative to tracking equipment Location parameter, wherein, the relative position parameter includes:Relative distance and/or relative angle;
Judge whether the relative distance is less than the 4th threshold value, and/or, judge whether the relative angle is less than the 5th threshold Value;
When the relative distance be less than the 6th threshold value, and/or, the relative angle be less than seven threshold value when, Determine that the wireless signal tracking recovers effective.
In some embodiments, the method further includes:
when the vision tracking is effective and the wireless signal tracking fails, re-determining the relative position parameter of the target object with respect to the tracking device using the second tracking information detected in the signal detection direction determined on the basis of the first tracking information, where the relative position parameter includes a relative distance and/or a relative angle;
judging whether the relative distance is smaller than a fifth threshold, and/or judging whether the relative angle is smaller than a sixth threshold;
and determining that the wireless signal tracking has recovered effective when the relative distance is smaller than the fifth threshold and/or the relative angle is smaller than the sixth threshold.
In this embodiment, the relative position parameter between the target object and the tracking device is determined directly from the second tracking information obtained after the signal detection direction is adjusted. If the relative distance is smaller than the fifth threshold and/or the relative angle is smaller than the sixth threshold, the tracking is considered to have recovered effective; otherwise the signal detection direction needs to be adjusted further to ensure effective tracking.
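A minimal sketch of this recovery check is shown below; the threshold values and units (metres, radians) are assumptions, and both conditions are required here even though the embodiment also allows either one alone.

```python
def wireless_recovered(relative_distance: float, relative_angle: float,
                       distance_threshold: float = 1.0,   # "fifth threshold", assumed metres
                       angle_threshold: float = 0.2) -> bool:  # "sixth threshold", assumed radians
    """Wireless signal tracking is deemed recovered once the re-measured relative
    position of the target object is close enough to the tracking device."""
    return relative_distance < distance_threshold and abs(relative_angle) < angle_threshold
```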
As shown in Fig. 5, this embodiment provides a target tracking device, including:
an acquisition unit 110, configured to acquire a tracking image;
a first acquisition unit 120, configured to obtain, based on the tracking image, first tracking information of vision tracking of a target object;
a detection unit 130, configured to detect a wireless signal used for the target tracking;
a second acquisition unit 140, configured to parse the wireless signal and obtain second tracking information of wireless signal tracking of the target object; and
a determining unit 150, configured to determine final tracking information of the target object by combining the first tracking information and the second tracking information.
In this embodiment, the tracking device may be any of various movable electronic devices, such as a ground mobile robot or a flying robot.
In this embodiment, the acquisition unit 110 may include an acquisition module capable of image acquisition, and the acquisition module may include one or more cameras for image acquisition.
The detection unit 130 may correspond to an antenna or the like capable of receiving the wireless signal.
The first acquisition unit 120, the second acquisition unit 140, and the determining unit 150 may correspond to a processor or the like inside the tracking device. The processor may include a central processing unit, a microprocessor, a digital signal processor, a programmable array, an application processor, or the like.
By executing a computer program, the processor can obtain the first tracking information and the second tracking information and finally determine the final tracking information of the target tracking.
The tracking device provided in this embodiment has a high tracking success rate and rarely exhibits the target-loss phenomenon.
Optionally, the determining unit 150 is specifically configured to judge, by combining the first tracking information and the second tracking information, whether target loss occurs in the vision tracking and the wireless signal tracking, and to determine the final tracking information according to the result of the judgment.
In some embodiments, the determining unit 150 is specifically configured to calculate the degree of approximation between the first tracking information and the second tracking information, compare the calculated degree of approximation with a preset first threshold, and, when the comparison result indicates that the degree of approximation does not meet a preset first approximation requirement, determine that target loss occurs in at least one of the vision tracking and the wireless signal tracking.
The determining unit 150 may correspond to a calculator or a processor with computing capability that can calculate the degree of approximation.
In some embodiments, the determining unit 150 is specifically configured to: calculate, based on the first tracking information, a first average distance and a first average angle of the target object relative to the tracking device within a first preset time; calculate, based on the second tracking information, a second average distance and a second average angle of the target object relative to the tracking device within the first preset time; and calculate the degree of approximation based on the first average distance, the second average distance, the first average angle, and the second average angle.
Further, the determining unit 150 is specifically configured to compare the calculated degree of approximation with the preset first threshold and, when the comparison result indicates that the degree of approximation does not meet the preset first approximation requirement, either directly determine that the vision tracking is invalid and the wireless signal tracking is effective, or judge whether the second tracking information changes continuously within a second preset time: if it does, determine that the wireless signal tracking is effective and the vision tracking is invalid; if it does not, determine that the vision tracking is effective and the wireless signal tracking is invalid.
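A minimal sketch of this decision logic follows. Mapping the gap between the two averaged position estimates to a similarity in (0, 1] is only an assumed concrete choice for the degree of approximation, and the threshold value is illustrative.

```python
import math
from statistics import mean
from typing import List, Tuple

Sample = Tuple[float, float]  # (distance to the tracking device, relative angle in radians)

def approximation_degree(first_info: List[Sample], second_info: List[Sample]) -> float:
    """Similarity in (0, 1]: 1 when the averaged vision and wireless estimates coincide,
    decaying as the implied planar positions drift apart."""
    d1, a1 = mean(d for d, _ in first_info), mean(a for _, a in first_info)
    d2, a2 = mean(d for d, _ in second_info), mean(a for _, a in second_info)
    gap = math.hypot(d1 * math.cos(a1) - d2 * math.cos(a2),
                     d1 * math.sin(a1) - d2 * math.sin(a2))
    return math.exp(-gap)

def diagnose(first_info: List[Sample], second_info: List[Sample],
             first_threshold: float = 0.6) -> str:
    """Decide which tracking mode failed when the two modes stop agreeing."""
    if approximation_degree(first_info, second_info) >= first_threshold:
        return "no target loss detected"
    # Disagreement: if the wireless (second) information still changes continuously,
    # treat the wireless signal tracking as effective and the vision tracking as invalid.
    changing = any(abs(b[0] - a[0]) > 1e-6 or abs(b[1] - a[1]) > 1e-6
                   for a, b in zip(second_info, second_info[1:]))
    return ("wireless effective, vision invalid" if changing
            else "vision effective, wireless invalid")
```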
In some embodiments, the determining unit 150 is further specifically configured to compare the calculated degree of approximation with the preset first threshold and, when the comparison result indicates that the degree of approximation does not meet the preset first approximation requirement, obtain, according to the second tracking information at the current moment, a first current relative position parameter of the target object relative to the tracking device and determine whether the first current relative position parameter satisfies a first preset condition; when the first current relative position parameter satisfies the first preset condition, determine that the vision tracking fails and the wireless signal tracking is effective; and/or, when the degree of approximation is smaller than the preset first threshold, obtain, according to the first tracking information at the current moment, a second current relative position parameter of the target object relative to the tracking device and determine whether the second current relative position parameter satisfies a second preset condition; when the second current relative position parameter satisfies the second preset condition, determine that the vision tracking is effective and the wireless signal tracking fails.
Optionally, the determining unit 150 is specifically configured to calculate a first distance between the first current relative position parameter and a first history relative position parameter, where the first history relative position parameter is a relative position parameter of the target object relative to the tracking device determined from the second tracking information detected at one or more historical moments before the current moment, or is the average of the relative position parameters of the target object relative to the tracking device determined from the second tracking information detected at at least two historical moments before the current moment.
When the first distance is smaller than a first specific value, it is determined that the first current relative position parameter satisfies the first preset condition.
The determining unit 150 is further specifically configured to calculate a second distance between the second current relative position parameter and a second history relative position parameter, where the second history relative position parameter is a relative position parameter of the target object relative to the tracking device determined from the first tracking information detected at one or more historical moments before the current moment, or is the average of the relative position parameters of the target object relative to the tracking device determined from the first tracking information detected at at least two historical moments before the current moment; when the second distance is smaller than a second specific value, it is determined that the second current relative position parameter satisfies the second preset condition.
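A minimal sketch of the first and second preset-condition checks described above; the relative position parameter is represented as a (distance, angle) pair, and the specific value is a placeholder.

```python
import math
from statistics import mean
from typing import Sequence, Tuple

Position = Tuple[float, float]  # (relative distance, relative angle)

def position_gap(current: Position, history: Sequence[Position]) -> float:
    """Distance between the current relative position parameter and the
    (possibly averaged) historical relative position parameter."""
    reference = (mean(p[0] for p in history), mean(p[1] for p in history))
    return math.hypot(current[0] - reference[0], current[1] - reference[1])

def meets_preset_condition(current: Position, history: Sequence[Position],
                           specific_value: float = 0.5) -> bool:
    """True when the current estimate stays close to its own recent history,
    i.e. the corresponding preset condition is satisfied."""
    return position_gap(current, history) < specific_value
```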
In some embodiments, the determining unit 150 is configured to maintain the current acquisition direction of the tracking device, or to maintain the current signal detection direction of the tracking device, when the signal detection direction is consistent with the acquisition direction.
In addition, the determining unit is further configured to, when the signal detection direction is inconsistent with the acquisition direction, obtain the relative position parameter of the target object relative to the tracking device based on the second tracking information and determine the current acquisition direction based on the relative position parameter and the signal detection direction; or obtain the relative position parameter of the target object relative to the tracking device based on the first tracking information and determine the signal detection direction based on the relative position parameter and the acquisition direction.
According to the current acquisition direction and the signal detection direction, the determining unit 150 can determine whether the acquisition direction and/or the signal detection direction need to be adjusted, so that the failed tracking mode can recover effective tracking.
The determining unit 150 is specifically configured to perform at least one of the following (a combined selection sketch is given after this list):
when the vision tracking fails and the wireless signal tracking is effective, determining that the second tracking information is the final tracking information;
when the vision tracking is effective and the wireless signal tracking is invalid, determining that the first tracking information is the final tracking information;
when both the vision tracking and the wireless signal tracking are effective, determining that the first tracking information is the final tracking information;
when both the vision tracking and the wireless signal tracking are effective, calculating the final tracking information as the dependent variable of a preset functional relation whose arguments are the first tracking information and the second tracking information.
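A combined sketch of these four cases is given below; the weighted average stands in for the preset functional relation, and the weight value is an assumption rather than something fixed by this embodiment.

```python
from typing import Optional, Tuple

TrackInfo = Tuple[float, float]  # (relative distance, relative angle)

def final_tracking_info(first: Optional[TrackInfo], second: Optional[TrackInfo],
                        vision_ok: bool, wireless_ok: bool,
                        fuse: bool = False, weight: float = 0.7) -> Optional[TrackInfo]:
    """Select or fuse the two tracking results according to which modes are effective."""
    if not vision_ok and wireless_ok:
        return second                      # vision failed: use wireless tracking
    if vision_ok and not wireless_ok:
        return first                       # wireless failed: use vision tracking
    if vision_ok and wireless_ok:
        if not fuse:
            return first                   # both effective: keep the vision result
        # Both effective: weighted average as one instance of the preset functional relation.
        return (weight * first[0] + (1 - weight) * second[0],
                weight * first[1] + (1 - weight) * second[1])
    return None                            # both invalid: no final tracking information
```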
In some embodiments, the determining unit 150 is specifically configured to, while the vision tracking fails and the target object is being tracked based on the second tracking information, obtain the first tracking information produced by the vision tracking, calculate the degree of approximation between the first tracking information and the second tracking information, compare the calculated degree of approximation with a preset second threshold, and determine that the vision tracking has recovered effective when the comparison result indicates that the degree of approximation meets a preset second approximation requirement.
Further, the determining unit 150 is further configured to, while the vision tracking is effective, the wireless signal tracking is invalid, and the target object is being tracked based on the first tracking information, obtain the second tracking information produced by the wireless signal tracking, calculate the degree of approximation between the first tracking information and the second tracking information, compare the calculated degree of approximation with a preset third threshold, and determine that the wireless signal tracking has recovered effective when the comparison result indicates that the degree of approximation meets a preset third approximation requirement.
For example, the determining unit 150 may be configured to, when the vision tracking fails and the wireless signal tracking is effective, determine the current acquisition direction based on the signal detection direction of the wireless signal tracking and the acquisition direction in which the vision tracking acquires the tracking image, so that the vision tracking can acquire a tracking image containing the target object; or, when the vision tracking is effective and the wireless signal tracking fails, determine the signal detection direction of the wireless signal tracking based on the signal detection direction of the wireless signal tracking and the acquisition direction in which the vision tracking acquires the tracking image. By adjusting the acquisition direction or the signal detection direction, the corresponding tracking mode is made effective again, and the calculation of the degree of approximation then verifies whether it has recovered.
In this embodiment, the determining unit 150 may likewise correspond to a processor or a processing circuit; by adjusting the acquisition direction or the signal detection direction, the failed tracking mode can be made effective again.
Further, the determining unit 150 is specifically configured to maintain the current acquisition direction of the tracking device when the signal detection direction is consistent with the acquisition direction, or to maintain the current signal detection direction of the tracking device when the signal detection direction is consistent with the acquisition direction; and/or the determining unit 150 is further specifically configured to, when the signal detection direction is inconsistent with the acquisition direction, obtain the relative position parameter of the target object relative to the tracking device based on the second tracking information and determine the current acquisition direction based on the relative position parameter and the signal detection direction; or, when the signal detection direction is inconsistent with the acquisition direction, obtain the relative position parameter of the target object relative to the tracking device based on the first tracking information and determine the signal detection direction based on the relative position parameter and the acquisition direction.
In addition to the solutions that verify whether tracking has recovered effective, in some embodiments the tracking device further includes:
a first verification unit, configured to, when the vision tracking fails and the wireless signal tracking is effective, determine a current acquisition parameter of the image acquisition according to the wireless signal detection direction and the image acquisition direction of the vision tracking; parse the current tracking image acquired with the current acquisition parameter and extract candidate image regions from the current tracking image based on the first image feature of the target object; extract the candidate image features of the graphical objects in the candidate image regions; calculate the matching degree between each candidate image feature and the second image feature of the target object, where the second image feature includes the first image feature; and determine that the vision tracking has recovered effective when a candidate image feature whose matching degree is greater than the fourth threshold is detected.
Further, the tracking device also includes: a second verification unit, configured to, when the vision tracking is effective and the wireless signal tracking fails, re-determine the relative position parameter of the target object relative to the tracking device using the second tracking information detected in the signal detection direction determined based on the first tracking information, where the relative position parameter includes a relative distance and/or a relative angle; judge whether the relative distance is smaller than the fifth threshold and/or whether the relative angle is smaller than the sixth threshold; and determine that the wireless signal tracking has recovered effective when the relative distance is smaller than the fifth threshold and/or the relative angle is smaller than the sixth threshold.
The first verification unit and the second verification unit may correspond to a processor or the like, which, according to the detected tracking information and the calculation of the first distance and/or the second distance, verifies whether the tracking of the corresponding tracking mode is effective or has recovered effective.
As shown in Fig. 6, this embodiment provides a tracking device, including:
an image acquisition module 210, configured to acquire a tracking image;
an antenna module 220, configured to detect a wireless signal;
a memory 230, configured to store a computer program; and
a processor 240, connected to the image acquisition module 210, the antenna module 220, and the memory 230, respectively, and configured to perform, by executing the computer program, the target tracking method provided by any one of the foregoing embodiments.
The image acquisition module 210 may include one or more cameras.
The antenna module 220 may include one or more antennas capable of receiving the wireless signal.
The memory 230 may include various storage media, for example a non-transitory storage medium, for storing the computer program.
The processor 240 may be a central processing unit, a microprocessor, an application processor, a digital signal processor, a programmable array, or the like, and may be connected to the image acquisition module 210, the antenna module 220, and the memory 230 through a bus 250. The bus may be an inter-integrated circuit (I2C) bus or the like.
The processor 240 can perform, by executing the computer program, the target tracking method provided by any one of the foregoing technical solutions, specifically the methods shown in Fig. 1 and/or Fig. 2.
An embodiment of the present invention further provides a computer storage medium that stores a computer program; after the computer program is executed by a processor, the target tracking method provided by any one of the foregoing embodiments can be performed.
The computer storage medium here may be a non-transitory storage medium, specifically various types of storage media such as an optical disc, a flash memory, or a removable hard disk.
Specific examples are provided below in combination with the above embodiments:
Example 1:
Tracking a target object and retrieving the tracked object after loss: during long-term tracking of a target object, the tracker may lose the set target object because of various kinds of interference, or the tracking may be taken over by a distracting object, so the originally set target object needs to be picked up again and normal tracking resumed. For example, during vision tracking, an interfering object inserted between the tracked target object and the tracking device causes the target's image to disappear from the acquired tracking image. Or the target object enters a bright area where the light is too strong, and the tracking device cannot adjust its acquisition parameters, such as acquisition brightness and acquisition contrast, to the current ambient brightness in time, so the acquired tracking image becomes overexposed and the target object cannot be successfully extracted from it.
Vision-based tracking can train a model from the target object template defined in the initial frame, use the model to track that target object in subsequent video frames, and continuously update the model during tracking, so as to adapt to changes in the target object's pose and to overcome interference from a complex background. The target object template may include one or more image features of the target object, or a feature extraction model of the imaged target object, and so on.
Because no offline training is required, such techniques are highly general and can track any object specified by the user. However, a major problem of long-duration tracking algorithms is that they cannot accurately judge during tracking whether the target object has been lost, nor accurately retrieve the originally set target object after it is lost. Moreover, purely vision-based retrieval of the target object is easily disturbed by similar objects and by the environment, making it difficult to build a robust long-term tracking system.
In this example, a UWB tracking technique is introduced on the basis of the vision tracking to assist it and to reduce the chance that the target object is lost or tracked incorrectly.
UWB is a carrier-free communication technique that transmits data using non-sinusoidal narrow pulses at the nanosecond to picosecond level. Its sub-nanosecond ultra-short pulses can be used for high-precision indoor positioning or target tracking at close range, with strong anti-interference capability, no carrier, low transmitter power, and high tracking accuracy, which can reach 10 cm.
The target tracking method combining UWB tracking and vision tracking is used to solve the problem of target object loss and retrieval during tracking.
1) The target object to be followed carries a UWB beacon. After tracking starts, vision tracking information is obtained through the vision tracking, and the robot is controlled to visually track the target object; the control information at each moment t includes the distance d_t^vision between the target object and the robot and the relative angle θ_t^vision between the target object and the robot. The UWB beacon is the transmitting device that sends the UWB signal.
2) During the vision tracking, the distance d_t^uwb from the UWB beacon to the robot and the relative angle θ_t^uwb are likewise acquired at each moment.
3) At each moment the distance and angle given by the UWB tracking and by the vision tracking are checked against each other, and processing is performed according to the check result. The specific steps are as follows:
The average distance and average angle obtained by UWB and by the vision algorithm over the past N moments are calculated respectively: d̄_uwb, the average distance of the UWB tracking; d̄_vision, the average distance of the vision tracking; θ̄_vision, the average angle of the vision tracking; and θ̄_uwb, the average angle of the UWB tracking.
The consistency S_conf of the two sets of distances and angles is then calculated. S_conf is the distance in space between the target position computed from the UWB measurements and the target position computed from the vision measurements. A distance threshold is set for it to judge whether the two sets of data are consistent: when S_conf exceeds the threshold, the target object is considered to have been tracked incorrectly. This consistency check in fact also covers the case where one set of tracking data is invalid; for example, once the target object in the vision tracking is lost, the distance and angle it outputs stay at 0, and the computed S_conf becomes very large. The consistency check designed in this way solves well the problem of accurately judging target loss in pure vision tracking. In a single-vision tracking system, most templates are updated online, so after the target is tracked incorrectly, i.e., the tracking is mistakenly taken over by a wrong target object, simply adding more vision-based information hardly solves the problem, while overly complex computation is hard to apply in a real-time system. The position check proposed here uses additional sensor information and, with almost no increase in computational complexity, can judge this situation very accurately.
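The consistency formula itself appears as an image in the original filing; the sketch below implements the interpretation stated above, taking S_conf as the planar distance between the positions implied by the UWB and vision averages, with an assumed threshold value.

```python
import math
from statistics import mean
from typing import Sequence, Tuple

Sample = Tuple[float, float]  # (distance, angle) at one moment

def consistency_check(uwb: Sequence[Sample], vision: Sequence[Sample],
                      distance_threshold: float = 0.5) -> Tuple[float, bool]:
    """S_conf over the last N samples of each source, plus a 'target lost' flag."""
    d_uwb, a_uwb = mean(s[0] for s in uwb), mean(s[1] for s in uwb)
    d_vis, a_vis = mean(s[0] for s in vision), mean(s[1] for s in vision)
    s_conf = math.hypot(d_uwb * math.cos(a_uwb) - d_vis * math.cos(a_vis),
                        d_uwb * math.sin(a_uwb) - d_vis * math.sin(a_vis))
    # When S_conf exceeds the threshold the two sources disagree: the target is
    # considered tracked incorrectly, or one source is invalid (e.g. vision
    # outputting constant zeros after losing the target).
    return s_conf, s_conf > distance_threshold
```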
When the tracked target object is judged to have deviated, the validity of the two sets of tracking data is verified first. If the UWB data are valid (they change across consecutive frames), UWB tracking is enabled and d̄_uwb and θ̄_uwb are used as the tracking information; tracking continues until the condition that the UWB tracking angle is smaller than a preset angle threshold is satisfied, at which point verification of the vision tracking of the followed target object is started. The meaning of setting a threshold for the angle is that, when the robot is correctly and stably tracking the target object, its pose faces the tracked target, so the measured angle should be close to 0; the angle threshold used here is typically smaller than the threshold used in the consistency check above. In this example this amounts to reducing the number of verifications after the vision tracking fails: whether the vision tracking has recovered is verified only when the tracking angle of the currently valid UWB tracking is smaller than a specific value, which indicates that the angle between the target object and the tracking device is very small and the target object is more likely to appear in the tracking image, thereby reducing the number of recovery verifications.
After the UWB tracking of the target object is stable, the vision tracking method is enabled for verification: a vision-based target object re-identification method is used to verify whether the currently tracked target object is consistent with the previously tracked target object; after consistency is confirmed, the normal tracking state of the vision algorithm is restored.
A target tracking method combining vision tracking and UWB tracking is described below with reference to Fig. 7; a compact sketch of the resulting control loop is given after the steps. The method includes:
Step S1: select the target object and determine whether the target object carries a target beacon;
Step S2: perform vision tracking and UWB tracking;
Step S3: check the tracking distance and tracking angle of the vision tracking against those of the UWB tracking;
Step S4: based on the check result, judge whether the same object is being tracked; if so, return to step S2; if not, go to step S5;
Step S5: keep the UWB tracking;
Step S6: verify the vision tracking based on the tracking distance and tracking angle of the UWB tracking;
Step S7: judge whether the same object is being tracked; if so, go to step S2; if not, return to step S5.
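The loop formed by steps S1 to S7 can be summarized as the sketch below; the callables are placeholders for the subsystems described in this example (target selection, the two trackers, the consistency check of steps S3/S4, and the vision re-verification of steps S6/S7).

```python
def tracking_loop(select_target, track_both, check_consistency,
                  track_uwb_only, verify_vision, max_steps: int = 1000) -> None:
    """Control loop mirroring steps S1-S7 of Fig. 7."""
    select_target()                              # S1: select target, confirm UWB beacon
    uwb_only = False
    for _ in range(max_steps):
        if not uwb_only:
            track_both()                         # S2: vision tracking + UWB tracking
            if not check_consistency():          # S3/S4: distance and angle check
                uwb_only = True                  # not the same object -> S5
        else:
            track_uwb_only()                     # S5: keep UWB tracking
            if verify_vision():                  # S6/S7: verify vision against UWB
                uwb_only = False                 # same object confirmed -> back to S2
```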
Example 2:
Step 1: a ground robot tracks a pedestrian.
Step 2: the ground robot carries a UWB module, a depth camera, and an RGB camera;
Step 3: the tracked target object carries a UWB beacon;
Step 4: tracking is performed on the video frames acquired by the RGB (red-green-blue) camera; the angle of the target object relative to the robot is calculated from the position of the target object in the image, and the depth information obtained by the depth camera is used to calculate the distance between the target object and the robot (a sketch of this computation follows the steps);
Step 5: during target object tracking, the vision tracking, whether implemented with a vision tracking algorithm or a tracking model, also involves various states, such as the target object being lost and the target object being retrieved; when the target object is lost, the output distance and angle are 0, and the target object is retrieved through verification after short-term UWB tracking.
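A sketch of the Step 4 computation under a pinhole-camera assumption: the horizontal angle is derived from the target's pixel column, and the distance is read from a depth image assumed to be metric and aligned with the RGB frame. The focal length and principal point values are illustrative.

```python
import math
from typing import Sequence, Tuple

def target_angle_and_distance(target_u: float, target_v: float,
                              depth_image: Sequence[Sequence[float]],
                              fx: float = 600.0, cx: float = 320.0) -> Tuple[float, float]:
    """Relative angle from the target's pixel column (pinhole model) and
    distance from the aligned depth image at the target's pixel."""
    angle = math.atan2(target_u - cx, fx)                  # radians, left/right of optical axis
    distance = depth_image[int(target_v)][int(target_u)]   # assumed metric depth in metres
    return angle, distance
```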
In this example, UWB tracking and vision tracking are used in combination, which solves the problem that a pure vision tracking system cannot accurately judge whether the target object is lost; short-term UWB tracking supplements the vision tracking and solves the problem that the target object is difficult to retrieve during long-term tracking.
In the several embodiments provided in this application, it should be understood that the disclosed device and method may be implemented in other ways. The device embodiments described above are merely illustrative; for example, the division of the units is only a division of logical functions, and other divisions are possible in actual implementation: multiple units or components may be combined or integrated into another system, or some features may be ignored or not performed. In addition, the coupling, direct coupling, or communication connection between the components shown or discussed may be indirect coupling or communication connection of devices or units through some interfaces, and may be electrical, mechanical, or in other forms.
The units described above as separate components may or may not be physically separate, and the components shown as units may or may not be physical units; they may be located in one place or distributed over multiple network units. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution of this embodiment.
In addition, the functional units in the embodiments of the present invention may all be integrated into one processing module, or each unit may serve as a separate unit, or two or more units may be integrated into one unit; the integrated unit may be implemented in the form of hardware or in the form of hardware plus a software functional unit.
Those of ordinary skill in the art will understand that all or part of the steps of the above method embodiments may be completed by hardware related to program instructions; the foregoing program may be stored in a computer-readable storage medium, and when executed, the program performs the steps of the above method embodiments.
The above are only specific embodiments of the present invention, but the protection scope of the present invention is not limited thereto. Any person familiar with the technical field can easily conceive of changes or substitutions within the technical scope disclosed by the present invention, which should all be covered within the protection scope of the present invention. Therefore, the protection scope of the present invention should be based on the protection scope of the claims.

Claims (19)

1. A target tracking method, applied to a target tracking device, characterized by comprising:
acquiring a tracking image;
obtaining, based on the tracking image, first tracking information of vision tracking of a target object;
detecting a wireless signal used for the target tracking;
parsing the wireless signal to obtain second tracking information of wireless signal tracking of the target object; and
determining final tracking information of the target object by combining the first tracking information and the second tracking information.
2. The method according to claim 1, characterized in that
the determining of the final tracking information of the target object by combining the first tracking information and the second tracking information comprises:
judging, by combining the first tracking information and the second tracking information, whether target loss occurs in the vision tracking and the wireless signal tracking; and
determining the final tracking information according to the result of the judgment.
3. The method according to claim 2, characterized in that
the judging, by combining the first tracking information and the second tracking information, whether target loss occurs in the vision tracking and the wireless signal tracking comprises:
calculating a degree of approximation between the first tracking information and the second tracking information; and
the determining of the final tracking information according to the result of the judgment comprises:
comparing the calculated degree of approximation with a preset first threshold, and determining that target loss occurs in at least one of the vision tracking and the wireless signal tracking when the comparison result indicates that the degree of approximation does not meet a preset first approximation requirement.
4. The method according to claim 3, characterized in that
the calculating of the degree of approximation between the first tracking information and the second tracking information comprises:
calculating, based on the first tracking information, a first average distance and a first average angle of the target object relative to the target tracking device within a first preset time;
calculating, based on the second tracking information, a second average distance and a second average angle of the target object relative to the target tracking device within the first preset time; and
calculating the degree of approximation based on the first average distance, the second average distance, the first average angle, and the second average angle.
5. The method according to claim 3 or 4, characterized in that
the comparing of the calculated degree of approximation with the preset first threshold and the determining that target loss occurs in at least one of the vision tracking and the wireless signal tracking when the comparison result indicates that the degree of approximation does not meet the preset first approximation requirement further comprise:
directly determining that the vision tracking is invalid and the wireless signal tracking is effective;
or,
judging whether the second tracking information changes continuously within a second preset time; if it does, determining that the wireless signal tracking is effective and the vision tracking is invalid; if it does not, determining that the vision tracking is effective and the wireless signal tracking is invalid.
6. The method according to claim 2, characterized in that
the method further comprises at least one of the following:
when the judgment result indicates that the vision tracking fails and the wireless signal tracking is effective, determining a current acquisition direction of the tracking image based on the signal detection direction of the wireless signal tracking and the acquisition direction in which the vision tracking acquires the tracking image, so that the vision tracking acquires a tracking image containing the target object; and
when the judgment result indicates that the vision tracking is effective and the wireless signal tracking fails, determining the signal detection direction of the wireless signal tracking based on the signal detection direction of the wireless signal tracking and the acquisition direction in which the vision tracking acquires the tracking image.
7. The method according to claim 6, characterized in that
when the signal detection direction is consistent with the acquisition direction, the determining of the current acquisition direction comprises:
maintaining the current acquisition direction of the target tracking device;
when the signal detection direction is consistent with the acquisition direction, the determining of the signal detection direction of the wireless signal tracking comprises:
maintaining the current signal detection direction of the target tracking device;
when the signal detection direction is inconsistent with the acquisition direction, the determining of the current acquisition direction comprises:
obtaining a relative position parameter of the target object relative to the target tracking device based on the second tracking information, and determining the current acquisition direction based on the relative position parameter and the signal detection direction; and
when the signal detection direction is inconsistent with the acquisition direction, the determining of the signal detection direction of the wireless signal tracking comprises:
obtaining a relative position parameter of the target object relative to the target tracking device based on the first tracking information, and determining the signal detection direction based on the relative position parameter and the acquisition direction.
8. The method according to any one of claims 2 to 4 and 6 to 7, characterized in that
the determining of the final tracking information according to the judgment result comprises at least one of the following:
when the vision tracking fails and the wireless signal tracking is effective, determining that the second tracking information is the final tracking information;
when the vision tracking is effective and the wireless signal tracking is invalid, determining that the first tracking information is the final tracking information;
when both the vision tracking and the wireless signal tracking are effective, determining that the first tracking information is the final tracking information; and
when both the vision tracking and the wireless signal tracking are effective, calculating the final tracking information as the dependent variable of a preset functional relation whose arguments are the first tracking information and the second tracking information.
9. The method according to claim 8, characterized in that
after determining that the second tracking information is the final tracking information when the vision tracking fails and the wireless signal tracking is effective, the method further comprises:
while tracking the target object based on the second tracking information, obtaining the first tracking information produced by the vision tracking, calculating the degree of approximation between the first tracking information and the second tracking information, comparing the calculated degree of approximation with a preset second threshold, and determining that the vision tracking has recovered effective when the comparison result indicates that the degree of approximation meets a preset second approximation requirement.
10. The method according to claim 8, characterized in that
after determining that the first tracking information is the final tracking information when the vision tracking is effective and the wireless signal tracking is invalid, the method further comprises:
while tracking the target object based on the first tracking information, obtaining the second tracking information produced by the wireless signal tracking, calculating the degree of approximation between the first tracking information and the second tracking information, comparing the calculated degree of approximation with a preset third threshold, and determining that the wireless signal tracking has recovered effective when the comparison result indicates that the degree of approximation meets a preset third approximation requirement.
11. A target tracking device, characterized by comprising:
an acquisition unit, configured to acquire a tracking image;
a first acquisition unit, configured to obtain, based on the tracking image, first tracking information of vision tracking of a target object;
a detection unit, configured to detect a wireless signal used for the target tracking;
a second acquisition unit, configured to parse the wireless signal and obtain second tracking information of wireless signal tracking of the target object; and
a determining unit, configured to determine final tracking information of the target object by combining the first tracking information and the second tracking information.
12. The target tracking device according to claim 11, characterized in that
the determining unit is specifically configured to judge, by combining the first tracking information and the second tracking information, whether target loss occurs in the vision tracking and the wireless signal tracking, and to determine the final tracking information according to the result of the judgment.
13. The target tracking device according to claim 12, characterized in that
the determining unit is specifically configured to calculate a degree of approximation between the first tracking information and the second tracking information, compare the calculated degree of approximation with a preset first threshold, and determine that target loss occurs in at least one of the vision tracking and the wireless signal tracking when the comparison result indicates that the degree of approximation does not meet a preset first approximation requirement.
14. The target tracking device according to claim 13, characterized in that
the determining unit is specifically configured to calculate, based on the first tracking information, a first average distance and a first average angle of the target object relative to the tracking device within a first preset time; calculate, based on the second tracking information, a second average distance and a second average angle of the target object relative to the tracking device within the first preset time; and calculate the degree of approximation based on the first average distance, the second average distance, the first average angle, and the second average angle.
15. The target tracking device according to claim 13 or 14, characterized in that
the determining unit is specifically configured to compare the calculated degree of approximation with the preset first threshold and, when the comparison result indicates that the degree of approximation does not meet the preset first approximation requirement, directly determine that the vision tracking is invalid and the wireless signal tracking is effective; or judge whether the second tracking information changes continuously within a second preset time: if it does, determine that the wireless signal tracking is effective and the vision tracking is invalid; if it does not, determine that the vision tracking is effective and the wireless signal tracking is invalid.
16. The target tracking device according to claim 12, 13 or 14, characterized in that
the determining unit is specifically configured to perform at least one of the following:
when the vision tracking fails and the wireless signal tracking is effective, determining that the second tracking information is the final tracking information;
when the vision tracking is effective and the wireless signal tracking is invalid, determining that the first tracking information is the final tracking information;
when both the vision tracking and the wireless signal tracking are effective, determining that the first tracking information is the final tracking information; and
when both the vision tracking and the wireless signal tracking are effective, calculating the final tracking information as the dependent variable of a preset functional relation whose arguments are the first tracking information and the second tracking information.
17. The target tracking device according to claim 16, characterized in that
the determining unit is specifically configured to: while the vision tracking fails and the target object is being tracked based on the second tracking information, obtain the first tracking information produced by the vision tracking, calculate the degree of approximation between the first tracking information and the second tracking information, compare the calculated degree of approximation with a preset second threshold, and determine that the vision tracking has recovered effective when the comparison result indicates that the degree of approximation meets a preset second approximation requirement; and
while the vision tracking is effective, the wireless signal tracking is invalid, and the target object is being tracked based on the first tracking information, obtain the second tracking information produced by the wireless signal tracking, calculate the degree of approximation between the first tracking information and the second tracking information, compare the calculated degree of approximation with a preset third threshold, and determine that the wireless signal tracking has recovered effective when the comparison result indicates that the degree of approximation meets a preset third approximation requirement.
18. A target tracking device, characterized by comprising:
an image acquisition module, configured to acquire a tracking image;
an antenna module, configured to detect a wireless signal;
a memory, configured to store a computer program; and
a processor, connected to the image acquisition module, the antenna module, and the memory, respectively, and configured to perform, by executing the computer program, the method according to any one of claims 1 to 10.
19. A computer storage medium, the computer storage medium storing a computer program; after the computer program is executed by a processor, the method according to any one of claims 1 to 10 can be performed.
CN201710374093.8A 2017-05-24 2017-05-24 Method for tracking target, target following equipment and computer storage medium Active CN107255468B (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN201710374093.8A CN107255468B (en) 2017-05-24 2017-05-24 Method for tracking target, target following equipment and computer storage medium
PCT/CN2018/088020 WO2018214909A1 (en) 2017-05-24 2018-05-23 Target tracking method, target tracking device, and computer storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201710374093.8A CN107255468B (en) 2017-05-24 2017-05-24 Method for tracking target, target following equipment and computer storage medium

Publications (2)

Publication Number Publication Date
CN107255468A true CN107255468A (en) 2017-10-17
CN107255468B CN107255468B (en) 2019-11-19

Family

ID=60027992

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201710374093.8A Active CN107255468B (en) 2017-05-24 2017-05-24 Method for tracking target, target following equipment and computer storage medium

Country Status (2)

Country Link
CN (1) CN107255468B (en)
WO (1) WO2018214909A1 (en)


Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105488815A (en) * 2015-11-26 2016-04-13 北京航空航天大学 Real-time object tracking method capable of supporting target size change
CN106683123A (en) * 2016-10-31 2017-05-17 纳恩博(北京)科技有限公司 Method and device for tracking targets

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
TWI388205B (en) * 2008-12-19 2013-03-01 Ind Tech Res Inst Method and apparatus for tracking objects
CN105629196B (en) * 2016-01-07 2018-05-25 观宇能源科技(上海)有限公司 Alignment system and correlation method based on computer vision and dynamic fingerprint
CN105915784A (en) * 2016-04-01 2016-08-31 纳恩博(北京)科技有限公司 Information processing method and information processing device
CN105973228A (en) * 2016-06-28 2016-09-28 江苏环亚医用科技集团股份有限公司 Single camera and RSSI (received signal strength indication) based indoor target positioning system and method
CN107255468B (en) * 2017-05-24 2019-11-19 纳恩博(北京)科技有限公司 Method for tracking target, target following equipment and computer storage medium

Cited By (23)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2018214909A1 (en) * 2017-05-24 2018-11-29 纳恩博(北京)科技有限公司 Target tracking method, target tracking device, and computer storage medium
CN108820215A (en) * 2018-05-21 2018-11-16 南昌航空大学 A kind of automatic air-drop unmanned plane of autonomous searching target
CN108820215B (en) * 2018-05-21 2021-10-01 南昌航空大学 Automatic air-drop unmanned aerial vehicle capable of automatically searching target
CN108764167B (en) * 2018-05-30 2020-09-29 上海交通大学 Space-time correlated target re-identification method and system
CN108764167A (en) * 2018-05-30 2018-11-06 上海交通大学 A kind of target of space time correlation recognition methods and system again
CN109156948A (en) * 2018-08-21 2019-01-08 芜湖职业技术学院 Automatically umbrella is followed
CN112567201B (en) * 2018-08-21 2024-04-16 深圳市大疆创新科技有限公司 Distance measuring method and device
CN112567201A (en) * 2018-08-21 2021-03-26 深圳市大疆创新科技有限公司 Distance measuring method and apparatus
CN109445465A (en) * 2018-10-17 2019-03-08 深圳市道通智能航空技术有限公司 Method for tracing, system, unmanned plane and terminal based on unmanned plane
CN109460077A (en) * 2018-11-19 2019-03-12 深圳博为教育科技有限公司 A kind of automatic tracking method, automatic tracking device and automatic tracking system
CN109658434A (en) * 2018-12-26 2019-04-19 成都纵横自动化技术股份有限公司 A kind of method and device of target following
CN109658434B (en) * 2018-12-26 2023-06-16 成都纵横自动化技术股份有限公司 Target tracking method and device
US11924538B2 (en) 2019-02-28 2024-03-05 Autel Robotics Co., Ltd. Target tracking method and apparatus and unmanned aerial vehicle
CN109828596A (en) * 2019-02-28 2019-05-31 深圳市道通智能航空技术有限公司 A kind of method for tracking target, device and unmanned plane
WO2020173463A1 (en) * 2019-02-28 2020-09-03 深圳市道通智能航空技术有限公司 Target tracking method and apparatus, and unmanned aerial vehicle
CN110418114A (en) * 2019-08-20 2019-11-05 京东方科技集团股份有限公司 A kind of method for tracing object, device, electronic equipment and storage medium
CN110418114B (en) * 2019-08-20 2021-11-16 京东方科技集团股份有限公司 Object tracking method and device, electronic equipment and storage medium
US11205276B2 (en) 2019-08-20 2021-12-21 Boe Technology Group Co., Ltd. Object tracking method, object tracking device, electronic device and storage medium
CN112922882A (en) * 2019-12-06 2021-06-08 佛山市云米电器科技有限公司 Fan control method, fan and computer readable storage medium
CN113674309A (en) * 2020-05-14 2021-11-19 杭州海康威视系统技术有限公司 Object tracking method, device, management platform and storage medium
CN113674309B (en) * 2020-05-14 2024-02-20 杭州海康威视系统技术有限公司 Method, device, management platform and storage medium for object tracking
CN111862154B (en) * 2020-07-13 2024-03-01 中移(杭州)信息技术有限公司 Robot vision tracking method and device, robot and storage medium
CN111862154A (en) * 2020-07-13 2020-10-30 中移(杭州)信息技术有限公司 Robot vision tracking method and device, robot and storage medium

Also Published As

Publication number Publication date
WO2018214909A1 (en) 2018-11-29
CN107255468B (en) 2019-11-19


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant