CN108876806A - Target tracking method and system, storage medium and device based on big data analysis
- Publication number: CN108876806A
- Application number: CN201810427346.8A
- Authority: CN (China)
- Legal status: Pending
Classifications
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/20—Analysis of motion
Abstract
The present invention relates to a target tracking method and system based on big data analysis, and to a computer storage medium and a device. The method acquires a first image of a target tracking area and determines a first center coordinate of the area from it, determines a feature value set of the tracked object in the area from the gray distribution of the gray image of the first image, obtains the feature value corresponding to any position in the area, and calculates a first probability density of that feature value within the feature value set according to a preset probability density function and the first center coordinate. After a preset time interval, a second image of the area is acquired, a second center coordinate of the area is determined, and a second probability density of the feature value corresponding to the position within the feature value set is calculated according to the preset probability density function and the second center coordinate. The tracked object is then tracked according to the motion vector determined from the first and second probability densities. The scheme reduces the influence of color changes on tracking during the tracking process and has high stability.
Description
Technical Field
The invention relates to the technical field of image processing, and in particular to a target tracking method and system based on big data analysis, a computer storage medium, and a device.
Background
In the technical field of image processing, the motion trail of a target object in a certain time can be determined by acquiring images of the target object in adjacent time slices, and the target object can be tracked according to the motion trail.
The currently adopted target tracking method tracks through color features: the motion trajectory of a target object is determined from the color features of that object in the acquired images, and the object is then tracked along that trajectory. For example, in the process in which a base station shoots a tracked object through an unmanned aerial vehicle, the motion trail of the tracked object relative to the unmanned aerial vehicle can be determined from the change of the color features of the tracked object in the images shot by the unmanned aerial vehicle, and the unmanned aerial vehicle tracks the tracked object according to that trail.
However, because the currently adopted target tracking method tracks according to color characteristics, and the color characteristics of a tracked object are easily affected during tracking by illumination, background color and the like, the tracking stability is low.
Disclosure of Invention
In view of the above, it is necessary to provide a target tracking method and system based on big data analysis, a computer storage medium, and a device, to address the technical problem that existing target tracking methods have low stability.
A target tracking method based on big data analysis comprises the following steps:
acquiring a first image of a target tracking area, and determining a first center coordinate of the target tracking area according to the first image;
graying the first image to obtain a gray image of the target tracking area, and determining a characteristic value set of a tracked object in the target tracking area according to gray distribution of the gray image;
acquiring a characteristic value corresponding to any position in the target tracking area, and calculating a first probability density of the characteristic value corresponding to the any position in the characteristic value set according to a preset probability density function and the first center coordinate;
after a preset time interval, acquiring a second image of the target tracking area, determining a second center coordinate of the target tracking area according to the second image, and calculating a second probability density of a feature value corresponding to the arbitrary position in the feature value set according to the preset probability density function and the second center coordinate;
and determining a movement vector according to the first probability density and the second probability density, and tracking the tracking object according to the movement vector.
In one embodiment, before the step of acquiring the first image of the target tracking area, the method further includes: determining a target tracking area and a tracking object in the target tracking area according to preset tracking object characteristics and a preset characteristic extraction algorithm;
or;
and determining a target tracking area and a tracking object in the target tracking area according to the received selected instruction.
Automatically determining the target tracking area and the tracked object in the area according to preset tracked-object features and a feature extraction algorithm means the area and the object can be determined without manual operation, which reduces labor cost; determining them directly from a received selection instruction improves the efficiency of determining the target tracking area and the tracked object.
In one embodiment, before the step of obtaining the feature value corresponding to any position in the target tracking area, the method further includes:
setting a weight value of each position in the target tracking area, wherein the weight value of each position in the target tracking area is in direct proportion to the distance from each position to the center position of the target tracking area;
and correcting the preset probability density function according to the weight corresponding to each position in the target tracking area.
Here the weights of the positions in the target tracking area are set in reverse: the weight of each position is in direct proportion to its distance from the center of the target tracking area. The preset probability density function is corrected accordingly, which improves the accuracy and stability of tracking the tracked object.
In one embodiment, the step of obtaining a feature value corresponding to an arbitrary position in the target tracking area, and calculating a first probability density of the feature value corresponding to the arbitrary position in the set of feature values according to a preset probability density function and the first center coordinate includes:
acquiring a characteristic value corresponding to a frame position of the target tracking area, calculating a first probability density of the characteristic value corresponding to the frame position in the characteristic value set according to the corrected probability density function and the first center coordinate, acquiring a characteristic value corresponding to a first center coordinate of the target tracking area, calculating a third probability density of the characteristic value corresponding to the first center coordinate in the characteristic value set according to the corrected probability density function and the first center coordinate, and acquiring a first similarity of the first probability density and the third probability density;
the step of calculating a second probability density of the eigenvalue corresponding to the arbitrary position in the eigenvalue set according to the preset probability density function and the second center coordinate comprises:
calculating a second probability density of the characteristic value corresponding to the frame position in the characteristic value set according to the corrected probability density function and the second center coordinate, calculating a fourth probability density of the characteristic value corresponding to the second center coordinate in the characteristic value set according to the corrected probability density function and the second center coordinate, and obtaining a second similarity of the second probability density and the fourth probability density;
the determining a motion vector from the first probability density and the second probability density, the tracking the tracked object from the motion vector comprising:
and determining a motion vector according to the first similarity and the second similarity, and tracking the tracked object according to the motion vector.
In this embodiment the weight of each position in the target tracking area is set in reverse and the probability density function is corrected. The first and third probability densities, of the feature values corresponding to the frame position and the first center coordinate in the first image of the target tracking area within the feature value set of the tracked object, are obtained, together with their first similarity. The second and fourth probability densities, of the feature values corresponding to the frame position and the second center coordinate in the second image within the feature value set, are then obtained, together with their second similarity. A motion vector is determined from the first and second similarities and the tracked object is tracked accordingly. This reduces the influence of background feature information in the tracking area, fully captures all effective features of the tracked object during tracking, and improves tracking accuracy and stability.
In one embodiment, the determining a motion vector according to the first similarity and the second similarity, and the tracking the tracked object according to the motion vector includes:
obtaining a ratio of the first similarity to the second similarity, and determining a motion vector according to the ratio;
if the ratio is larger than a preset value, determining that the motion vector is a far vector, and tracking the tracked object according to the far vector;
if the ratio is equal to the preset value, determining that the motion vector is a static vector, and tracking the tracked object according to the static vector;
and if the ratio is smaller than the preset value, determining that the motion vector is a close vector, and tracking the tracked object according to the close vector.
By determining the ratio of the first similarity to the second similarity, the motion vector for tracking the tracked object is accurately determined according to the ratio, and the tracking stability is improved.
A big-data-analysis-based target tracking system, comprising:
the first acquisition module is used for acquiring a first image of a target tracking area and determining a first center coordinate of the target tracking area according to the first image;
the determining module is used for carrying out graying processing on the first image to obtain a gray image of the target tracking area and determining a characteristic value set of a tracked object in the target tracking area according to gray distribution of the gray image;
the calculation module is used for acquiring a characteristic value corresponding to any position in the target tracking area, and calculating a first probability density of the characteristic value corresponding to the any position in the characteristic value set according to a preset probability density function and the first center coordinate;
a second obtaining module, configured to obtain a second image of the target tracking area after a preset time interval, determine a second center coordinate of the target tracking area according to the second image, and calculate a second probability density of a feature value corresponding to the arbitrary position in the feature value set according to the preset probability density function and the second center coordinate;
and the tracking module is used for determining a movement vector according to the first probability density and the second probability density and tracking the tracked object according to the movement vector.
In one embodiment, the big data analysis-based target tracking system further includes:
the setting module is used for setting the weight of each position in the target tracking area before the calculation module obtains the characteristic value corresponding to any position in the target tracking area, wherein the weight of each position in the target tracking area is in direct proportion to the distance from each position to the center position of the target tracking area;
and the correcting module is used for correcting the preset probability density function according to the weight corresponding to each position in the target tracking area.
The weight of each position in the target tracking area is reversely set through the setting module, namely the weight of different positions is in direct proportion to the distance from each position to the center of the target tracking area, and then the preset probability density function is corrected through the correcting module, so that the accuracy and the stability of tracking the tracked object are improved.
In one embodiment, the calculation module is further configured to obtain a feature value corresponding to a border position of the target tracking area, calculate a first probability density of the feature value corresponding to the border position within the feature value set according to the modified probability density function and the first center coordinate, obtain a feature value corresponding to a first center coordinate of the target tracking area, calculate a third probability density of the feature value corresponding to the first center coordinate within the feature value set according to the modified probability density function and the first center coordinate, and obtain a first similarity between the first probability density and the third probability density;
the second obtaining module is further configured to obtain a second image of the target tracking area after a preset time interval, determine a second center coordinate in the target tracking area according to the second image, calculate a second probability density of a feature value corresponding to the frame position in the feature value set according to the modified probability density function and the second center coordinate, calculate a fourth probability density of the feature value corresponding to the second center coordinate in the feature value set according to the modified probability density function and the second center coordinate, and obtain a second similarity between the second probability density and the fourth probability density;
the tracking module is further configured to determine a motion vector according to the first similarity and the second similarity, and track the tracked object according to the motion vector.
In this embodiment the setting module and the modification module set the weight of each position in the target tracking area in reverse and correct the probability density function. The calculation module then obtains the first and third probability densities, of the feature values corresponding to the frame position and the first center coordinate in the first image of the target tracking area within the feature value set of the tracked object, and their first similarity. After a certain time, the second obtaining module obtains the second and fourth probability densities, of the feature values corresponding to the frame position and the second center coordinate in the second image within the feature value set, and their second similarity. The tracking module determines the motion vector from the first and second similarities and tracks the tracked object. During tracking this reduces the influence of background feature information in the tracking area, fully captures all effective features of the tracked object, and improves tracking accuracy and stability.
A computer storage medium having stored thereon a computer program which, when executed by a processor, implements the big-data-analysis-based target tracking method.
A computer device comprises a memory, a processor and a computer program stored on the memory and capable of running on the processor, wherein the processor executes the program to realize the target tracking method based on big data analysis.
The above target tracking method and system based on big data analysis, computer storage medium, and device determine the first center coordinate of the target tracking area from the acquired first image of the area, determine the feature value set of the tracked object in the area from its gray image, and calculate the first probability density, within the feature value set, of the feature value corresponding to any position in the area according to the preset probability density function and the first center coordinate. After a certain time interval, a second image of the area is acquired, the second center coordinate is determined, and the second probability density of the feature value corresponding to that position within the feature value set is calculated according to the preset probability density function and the second center coordinate. A motion vector is determined from the first and second probability densities and the tracked object is tracked accordingly. During tracking, the influence on tracking of the color features of the tracked object and of changes in illumination or background color is thus reduced, and tracking stability is improved.
Drawings
FIG. 1 is a diagram of an application environment of a big data analysis-based target tracking method according to an embodiment;
FIG. 2 is a flow diagram of a method for target tracking based on big data analysis according to an embodiment;
FIG. 3 is a flowchart of a target tracking method based on big data analysis according to another embodiment;
FIG. 4 is a schematic structural diagram of a target tracking system based on big data analysis according to an embodiment;
FIG. 5 is a schematic structural diagram of a target tracking system based on big data analysis according to another embodiment;
FIG. 6 is a diagram illustrating an internal structure of a computer device according to an embodiment.
Detailed Description
The technical solution of the present invention is described in detail below with reference to specific embodiments and the accompanying drawings.
Fig. 1 illustrates the application environment of a target tracking method based on big data analysis according to an embodiment. The method can be applied to an unmanned aerial vehicle system, so that a target is tracked automatically by that system. As shown in fig. 1, the unmanned aerial vehicle 10 includes a flying device 101, a camera 102, a processor 103, a memory 104, and a communication device 105, with the processor 103 connected to the flying device 101, the camera 102, the memory 104, and the communication device 105. The flying device 101 implements automatic flight of the unmanned aerial vehicle 10; the camera 102 acquires images, that is, images of the target tracking area during the flight of the unmanned aerial vehicle 10; and the processor 103 processes the acquired images, performing image and data processing to obtain data related to the target tracking area and the tracked object. The obtained data may be stored in the memory 104 of the unmanned aerial vehicle 10 or transmitted to a data receiver through the communication device 105.
Fig. 2 is a schematic flow chart of a target tracking method based on big data analysis according to an embodiment, where the method includes:
step S201: acquiring a first image of a target tracking area, and determining a first center coordinate of the target tracking area according to the first image;
step S202: graying the first image to obtain a gray image of the target tracking area, and determining a characteristic value set of a tracked object in the target tracking area according to gray distribution of the gray image;
step S203: acquiring a characteristic value corresponding to any position in the target tracking area, and calculating a first probability density of the characteristic value corresponding to the any position in the characteristic value set according to a preset probability density function and the first center coordinate;
step S204: after a preset time interval, acquiring a second image of the target tracking area, determining a second center coordinate of the target tracking area according to the second image, and calculating a second probability density of a feature value corresponding to the arbitrary position in the feature value set according to the preset probability density function and the second center coordinate;
step S205: and determining a movement vector according to the first probability density and the second probability density, and tracking the tracking object according to the movement vector.
In this target tracking method based on big data analysis, the first center coordinate of the target tracking area is determined from its acquired first image, the feature value set of the tracked object in the area is determined from the gray image of the area, and the first probability density, within the feature value set, of the feature value corresponding to any position in the area is calculated according to the preset probability density function and the first center coordinate. After a certain time interval, a second image of the area is acquired, the second center coordinate is determined, and the second probability density of the feature value corresponding to that position within the feature value set is calculated according to the preset probability density function and the second center coordinate. A motion vector is determined from the first and second probability densities and the tracked object is tracked accordingly, so that during tracking the influence on tracking of the color features of the tracked object and of changes in illumination or background color is reduced and tracking stability is improved.
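Steps S201 and S202 can be illustrated with the minimal Python sketch below. It is not the patent's implementation: the bin count, the (row, col) coordinate convention, and the OpenCV/NumPy calls are assumptions made only for illustration.

import cv2
import numpy as np

def feature_value_set(image_bgr, num_bins=16):
    """Gray the tracking-area image and bin its gray levels (step S202).

    Returns the bin index of every pixel (b(x_i) in the description below)
    and the set of occupied bin indices, i.e. the feature value set R."""
    gray = cv2.cvtColor(image_bgr, cv2.COLOR_BGR2GRAY)
    bin_width = 256 // num_bins
    bin_map = gray // bin_width          # b(x_i): feature value of each pixel
    feature_values = np.unique(bin_map)  # feature value set R
    return bin_map, feature_values

def center_coordinate(box):
    """Center coordinate of a tracking frame given as (x, y, w, h), in (row, col) order."""
    x, y, w, h = box
    return np.array([y + h / 2.0, x + w / 2.0])

A first center coordinate would come from the tracking frame of the first image and a second one from the frame placed on the second image, in line with steps S201 and S204.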
In practical application, the target tracking method based on big data analysis can be applied to an unmanned aerial vehicle system: videos or images of the tracked object are shot by the unmanned aerial vehicle, and the tracked object is then tracked according to the obtained images. For convenience, the following description takes the case in which the method is applied to an unmanned aerial vehicle system as an example.
Before step S201, the method may further include determining the target tracking area and the tracked object in it according to preset tracked-object features and a preset feature extraction algorithm. A Scale-Invariant Feature Transform (SIFT) algorithm can be adopted for feature detection and selection, that is, the target tracking area and the tracked object are determined automatically through the SIFT algorithm and the preset features of the tracked object, where those features include the features of the tracked object itself and the size of the tracking area in which it lies. Determining the target tracking area and the tracked object automatically in this way means no manual operation is needed, which reduces labor cost. Alternatively, the target tracking area and the tracked object can be determined according to a received selection instruction, which, compared with automatic determination through a feature extraction algorithm, improves the efficiency of determining them.
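As a hedged illustration of such SIFT-based detection, the sketch below assumes OpenCV 4.4+ (cv2.SIFT_create) and a precomputed descriptor array template_desc for the preset tracked-object features; the matching thresholds and the default box size are not specified by the patent and are assumptions.

import cv2
import numpy as np

def locate_tracking_area(frame_bgr, template_desc, box_size=(120, 120)):
    """Roughly locate the tracked object by matching the frame's SIFT descriptors
    against preset object descriptors, then place a tracking frame around the
    matched keypoints. A sketch only, not the patent's procedure."""
    gray = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY)
    sift = cv2.SIFT_create()
    keypoints, descriptors = sift.detectAndCompute(gray, None)
    if descriptors is None:
        return None
    matcher = cv2.BFMatcher(cv2.NORM_L2)
    matches = matcher.knnMatch(descriptors, template_desc, k=2)
    good = [m[0] for m in matches
            if len(m) == 2 and m[0].distance < 0.75 * m[1].distance]  # Lowe's ratio test
    if not good:
        return None
    pts = np.float32([keypoints[m.queryIdx].pt for m in good])
    cx, cy = pts.mean(axis=0)                       # rough object center
    w, h = box_size
    return int(cx - w / 2), int(cy - h / 2), w, h   # tracking frame (x, y, w, h)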
In one embodiment, a tracked target is tracked by an unmanned aerial vehicle. The unmanned aerial vehicle first acquires the current image of the target tracking area; the target tracking area is the tracking frame of the unmanned aerial vehicle, and the area inside the tracking frame is the first area. The unmanned aerial vehicle obtains the feature value set R of the tracked object from the gray distribution inside the tracking frame and computes the center coordinate point y_0 of the tracking frame. The probability density of the feature value \mu \in R corresponding to any position inside the tracking frame is then calculated as

\hat{q}_\mu(y_0) = C \sum_{i=1}^{n} k\left(\left\| \frac{y_0 - x_i}{h} \right\|^2\right) \delta\left[b(x_i) - \mu\right]    (1)

where k(x) is the contour function of the kernel, which weights the positions inside the tracking frame so that points closer to the tracking-frame center y_0 receive larger weight than points farther away; x_i denotes each pixel point inside the tracking frame; h is the width of the kernel function; \delta[b(x_i) - \mu] determines whether the gray value of point x_i falls into the \mu-th feature-value interval; n is the total number of pixel points inside the tracking frame; and C is a normalization constant. If every point of the tracking frame satisfies the feature value, the similarity is 1, that is, the frames are completely similar.
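A minimal NumPy sketch of equation (1) follows. The Epanechnikov profile and the normalization to unit sum are my assumptions for illustration; the symbols (y_0, h, b(x_i), delta) follow the description above.

import numpy as np

def epanechnikov_profile(r):
    """Kernel profile k(r): larger near the center, zero beyond the frame radius (r >= 1)."""
    return np.where(r < 1.0, 1.0 - r, 0.0)

def probability_density(bin_map, y0, h, num_bins=16, profile=epanechnikov_profile):
    """Equation (1): density of every feature value mu over the tracking frame.

    bin_map -- b(x_i), the feature-value (gray-bin) index of each pixel
    y0      -- center coordinate of the tracking frame, (row, col)
    h       -- kernel width (radius of the tracking frame)"""
    rows, cols = np.indices(bin_map.shape)
    r2 = ((rows - y0[0]) ** 2 + (cols - y0[1]) ** 2) / float(h * h)
    weights = profile(r2)                               # k(||(y0 - x_i)/h||^2)
    density = np.zeros(num_bins)
    for mu in range(num_bins):                          # delta[b(x_i) - mu]
        density[mu] = weights[bin_map == mu].sum()
    total = density.sum()
    return density / total if total > 0 else density    # C normalizes the densities to sum 1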
After a preset time interval, the unmanned aerial vehicle acquires a new image of the target tracking area, takes another point selected within the area covered by the kernel width in the new image as the new coordinate y, and takes the area covered by the tracking frame centered at this new coordinate as the second area. Equation (1) is applied again to calculate the new probability density that the feature value at any position in the second area, determined by the new center coordinate, belongs to the feature value set. The two probability densities are then compared with the Bhattacharyya coefficient, i.e. the degree of similarity of the first area to the second area is

\rho(y) = \sum_{\mu \in R} \sqrt{ \hat{p}_\mu(y)\, \hat{q}_\mu(y_0) }    (2)

where \hat{p}_\mu(y) is the probability density of the second region for feature value \mu and \hat{q}_\mu(y_0) is the probability density of the first region for feature value \mu. Combining equations (1) and (2), the Taylor expansion of \rho(y) about the point \hat{p}_\mu(y_0) is

\rho(y) \approx \frac{1}{2} \sum_{\mu \in R} \sqrt{ \hat{p}_\mu(y_0)\, \hat{q}_\mu(y_0) } + \frac{C}{2} \sum_{i=1}^{n} w_i\, k\left(\left\| \frac{y - x_i}{h} \right\|^2\right), \qquad w_i = \sum_{\mu \in R} \sqrt{ \frac{\hat{q}_\mu(y_0)}{\hat{p}_\mu(y_0)} }\, \delta\left[b(x_i) - \mu\right]    (3)
the left-hand calculation of the plus sign in equation (3) is independent of the value of y, so only the right-hand equation function f is neededkThe following were used:
for f is to fkAnd calculating to obtain a movement vector, wherein the movement vector is a drift vector which the unmanned aerial vehicle should do, and the target area center coordinate of the image after moving is obtained. Tracking the tracking object by the unmanned aerial vehicle in the embodiment, reducing the influence of the color characteristics, illumination or background color of the tracking object on tracking, and improving the tracking stability.
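The Python sketch below is illustrative only: it computes the Bhattacharyya similarity of equation (2) and one drift step in the spirit of equations (3) and (4), assuming density vectors like those produced by the probability_density sketch above and an unweighted candidate histogram. It is not the patent's reference implementation.

import numpy as np

def bhattacharyya(p, q):
    """Equation (2): Bhattacharyya coefficient between two density vectors."""
    return float(np.sum(np.sqrt(p * q)))

def mean_shift_step(bin_map, q_model, y, h, num_bins=16):
    """One drift step: move the center to the weighted mean of the pixels inside
    the kernel window, with per-pixel weights w_i taken from equation (3)."""
    rows, cols = np.indices(bin_map.shape)
    r2 = ((rows - y[0]) ** 2 + (cols - y[1]) ** 2) / float(h * h)
    inside = r2 < 1.0
    p_candidate = np.zeros(num_bins)                 # histogram of the candidate frame
    for mu in range(num_bins):
        p_candidate[mu] = np.sum(inside & (bin_map == mu))
    p_candidate /= max(p_candidate.sum(), 1.0)
    with np.errstate(divide="ignore", invalid="ignore"):
        ratio = np.where(p_candidate > 0, q_model / p_candidate, 0.0)
    w = np.sqrt(ratio)[bin_map] * inside             # w_i for every pixel in the window
    if w.sum() == 0:
        return np.asarray(y, dtype=float)
    new_y = np.array([np.sum(w * rows), np.sum(w * cols)]) / w.sum()
    return new_y                                     # drift vector is new_y - y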
When the tracked object is tracked with this algorithm, the width of the kernel function is fixed, so the size of the tracking frame is fixed. If, during shooting and tracking by the unmanned aerial vehicle, the tracked object comes closer to the unmanned aerial vehicle, that is, the image of the tracked object in the video shot by the unmanned aerial vehicle is gradually enlarged while the tracking frame and the kernel width remain unchanged, the features of the tracked object inside the tracking frame become insufficient: part or most of the effective features cannot be captured, so tracking becomes inaccurate or even fails. Conversely, when the image of the tracked object in the video gradually shrinks, more background feature information is mixed into the tracking frame, the proportion of tracked-object features falls, the similarity between the changed tracking frame and the original tracking frame becomes smaller and smaller, and tracking is easily lost when interference occurs. It is therefore necessary to improve the accuracy of the tracking process and prevent tracking failure.
Before step S203, a weight may be set for each position in the target tracking area, with the weight of each position in direct proportion to its distance from the center position of the target tracking area, and the preset probability density function is then corrected according to the weight corresponding to each position. Compared with the traditional weighting scheme, this breaks the habitual thinking and adopts reverse weighting: the weight of each position is proportional to its distance from the center of the target tracking area, so positions closer to the center carry smaller weight. Correcting the preset probability density function in this way reduces the influence of background feature information in the tracking area on the weights, improves the accuracy of capturing the effective features of the tracked object, and improves the accuracy and stability of tracking.
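To illustrate the reverse weighting described above, the sketch below replaces the usual center-heavy profile with one that grows with distance from the center. The linear form is only an assumed example; the patent does not fix the exact profile.

import numpy as np

def reverse_profile(r):
    """Reverse-weight kernel profile: the weight grows with the normalized squared
    distance r from the tracking-frame center and is cut off at the frame edge."""
    return np.where(r < 1.0, r, 0.0)

def reverse_weight_map(shape, y0, h):
    """Per-pixel weights proportional to the distance from the center y0 (row, col)."""
    rows, cols = np.indices(shape)
    r2 = ((rows - y0[0]) ** 2 + (cols - y0[1]) ** 2) / float(h * h)
    return reverse_profile(r2)

# In the earlier probability_density sketch, the corrected density would be obtained
# by passing reverse_profile as the kernel profile, so border pixels dominate.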
Further, after the preset probability density function has been corrected, step S203 may further include: obtaining the feature value corresponding to the frame position of the target tracking area, calculating the first probability density of that feature value within the feature value set of the tracked object according to the corrected probability density function and the first center coordinate, obtaining the feature value corresponding to the first center coordinate, calculating the third probability density of that feature value within the feature value set according to the corrected probability density function and the first center coordinate, and obtaining the first similarity of the first and third probability densities. Step S204 may further include: calculating the second probability density, within the feature value set of the tracked object, of the feature value corresponding to the frame position according to the corrected probability density function and the second center coordinate, calculating the fourth probability density, within the feature value set, of the feature value corresponding to the second center coordinate according to the corrected probability density function and the second center coordinate, and obtaining the second similarity of the second and fourth probability densities. Step S205 may further include determining a motion vector according to the first and second similarities and tracking the tracked object according to the motion vector. After the probability density function has been corrected by setting the weights in reverse, the weight difference between the frame position and the center position of the target tracking area is exploited, so that all effective features of the tracked object are fully captured, the influence of background feature information in the tracking area is reduced, tracking accuracy and stability are improved, and loss of tracking is prevented.
In one embodiment, when the unmanned aerial vehicle tracks the tracked object, a new weight-division mechanism is adopted: the closer a position or pixel point is to the center of the target tracking area, the smaller its weight, and the closer it is to the frame of the target tracking area, the larger its weight. The kernel function is corrected according to this weight division, defining a new kernel profile G(x) (equation (6)) whose value increases with distance from the center.
Substituting equation (6) into the preset probability density calculation, equation (1), gives the corrected probability density function. Using it, the probability density \hat{p}^{b}_\mu that the feature value at the first boundary of the tracking frame belongs to the feature value set is calculated; the feature value corresponding to the center coordinate of the target tracking area is then obtained and its probability density \hat{p}^{c}_\mu is calculated; and the Bhattacharyya similarity of the two gives the first similarity

p_1 = \sum_{\mu \in R} \sqrt{ \hat{p}^{b}_\mu\, \hat{p}^{c}_\mu }    (7)

After a preset time interval, the unmanned aerial vehicle acquires a new image of the target tracking area, the same calculation is repeated for the probability density of the feature value at the second boundary of the moved tracking frame and the probability density of the feature value at the new center coordinate, and their similarity gives p_2. A motion vector is then determined from p_1 and p_2 and the tracked object is tracked.
Specifically, the ratio \lambda of p_1 to p_2 can be obtained and compared with a preset ratio to determine the motion vector and track the tracked object. The preset ratio may be 1, and a ratio of similarities is always greater than 0. When \lambda > 1, the similarity between the feature value at the boundary of the original tracking frame and the target center is higher than that at the boundary of the new tracking frame, which indicates that the target has become larger than the tracking frame; the motion vector is therefore a far vector, the unmanned aerial vehicle moves away from the tracked object according to it, and the size of the tracking frame is increased. When 0 < \lambda < 1, the similarity at the boundary of the original tracking frame is lower than at the boundary of the new tracking frame, which indicates that the target has become smaller than the tracking frame; the motion vector is therefore a close vector, the unmanned aerial vehicle moves closer to the tracked object according to it, and the size of the tracking frame is reduced. When \lambda = 1, the motion vector is a static vector: the unmanned aerial vehicle keeps its position and the tracking frame size is unchanged.
By determining the ratio of the first similarity to the second similarity, the motion vector for tracking the tracked object is accurately determined according to the ratio, and the tracking stability is improved.
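The decision rule of this embodiment can be summarized with the short Python sketch below. The threshold of 1 follows the text; the scale-adjustment factor for the tracking frame is an assumption added only for illustration.

def classify_motion(p1, p2, scale_step=1.1):
    """Compare the two boundary/center similarities and return the motion-vector type
    plus a suggested tracking-frame scale factor, following lambda = p1 / p2."""
    if p2 == 0:
        raise ValueError("second similarity must be non-zero")
    lam = p1 / p2
    if lam > 1.0:
        return "far", scale_step          # move away, enlarge the tracking frame
    if lam < 1.0:
        return "close", 1.0 / scale_step  # move closer, shrink the tracking frame
    return "static", 1.0                  # keep the distance and the frame size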
To make the technical solution of the present invention clearer, a flow chart of a target tracking method based on big data analysis according to an embodiment shown in fig. 3 is provided, where the method may include:
step S301: determining a tracking area and a tracking object in the area according to the received selected instruction;
step S302: acquiring a first image of a tracking area, and determining a first center coordinate of the tracking area according to the first image;
step S303: graying the first image to obtain a gray image of a tracking area, and determining a characteristic value set of a tracking object in the area according to gray distribution of the gray image;
step S304: setting the weight value of each position in the tracking area to be in direct proportion to the distance from each position to the center position of the tracking area, and acquiring a modified probability density function according to the weight value of each position in the tracking area; the position weight value of the tracking area which is farther from the center coordinate is larger;
step S305: calculating a first probability density, within the feature value set, of the feature value corresponding to the frame position of the tracking area according to the corrected probability density function and the first center coordinate;
step S306: calculating a third probability density, within the feature value set, of the feature value corresponding to the first center coordinate according to the corrected probability density function and the first center coordinate, and obtaining a first similarity of the first probability density and the third probability density;
step S307: after a preset time interval, acquiring a second image of the tracking area, determining a second center coordinate of the tracking area according to the second image, and calculating a second probability density of the characteristic value corresponding to the frame position in the characteristic value set according to the corrected probability density function and the second center coordinate;
step S308: calculating a fourth probability density of the characteristic value corresponding to the second center coordinate in the characteristic value set according to the corrected probability density function and the second center coordinate, and obtaining a second similarity of the second probability density and the fourth probability density;
step S309: determining a motion vector according to the first similarity and the second similarity, and tracking the tracked object according to the motion vector.
In this target tracking method based on big data analysis, the weight of each position in the tracking area is set in reverse, the probability density function is corrected, and the weight difference between the frame position and the center position of the tracking area is exploited, so that the influence of background feature information in the tracking area is reduced, all effective features of the tracked object are fully captured, and tracking accuracy and stability are improved.
For the technical problem of low stability of the current target tracking technology, it is necessary to provide a target tracking system based on big data analysis, as shown in fig. 4, the system includes:
a first obtaining module 401, configured to obtain a first image of a target tracking area, and determine a first center coordinate of the target tracking area according to the first image;
a determining module 402, configured to perform graying processing on the first image to obtain a grayscale image of the target tracking area, and determine a feature value set of a tracked object in the target tracking area according to a grayscale distribution of the grayscale image;
a calculating module 403, configured to obtain a feature value corresponding to an arbitrary position in the target tracking area, and calculate a first probability density of the feature value corresponding to the arbitrary position in the feature value set according to a preset probability density function and the first center coordinate;
a second obtaining module 404, configured to obtain a second image of the target tracking area after a preset time interval, determine a second center coordinate of the target tracking area according to the second image, and calculate a second probability density of a feature value corresponding to the arbitrary position in the feature value set according to the preset probability density function and the second center coordinate;
a tracking module 405, configured to determine a motion vector according to the first probability density and the second probability density, and track the tracked object according to the motion vector.
In this target tracking system based on big data analysis, the first obtaining module 401 acquires the first image of the target tracking area and determines the first center coordinate of the area; the determining module 402 determines the feature value set of the tracked object in the area; the calculation module 403 calculates the first probability density, within the feature value set, of the feature value corresponding to any position in the area according to the preset probability density function and the first center coordinate; after a certain time interval, the second obtaining module 404 acquires a second image of the target tracking area, determines the second center coordinate, and calculates the second probability density of the feature value corresponding to that position within the feature value set according to the preset probability density function and the second center coordinate; and the tracking module 405 determines a motion vector from the first and second probability densities and tracks the tracked object according to it. During tracking, the influence on tracking of the color features of the tracked object and of changes in illumination or background color is thus reduced, and tracking stability is improved.
The first obtaining module 401 may further determine the target tracking area and the tracked object in the target tracking area according to a preset tracked object feature and a preset feature extraction algorithm. The target tracking area and the target in the target tracking area are automatically determined according to the preset tracking object characteristics and the characteristic extraction algorithm, the target tracking area and the target can be determined without manual operation, and the labor cost is reduced; similarly, the first obtaining module 401 may further determine the target tracking area and the tracked object in the target tracking area according to the received selected instruction, and improve the efficiency of determining the target tracking area and the tracked object compared with performing automatic determination through a feature extraction algorithm.
In the embodiment shown in fig. 5, the target tracking system based on big data analysis may further include a setting module 406 and a modifying module 407. The setting module 406 sets the weight of each position in the target tracking area, with the weight in direct proportion to the distance from that position to the center position of the area, and the modifying module 407 corrects the preset probability density function according to the weight corresponding to each position. Compared with the traditional weighting scheme, this system breaks the habitual thinking: the setting module 406 sets the weights in reverse, so that the weight of each position is proportional to its distance from the center of the target tracking area and positions closer to the center carry smaller weight, and the modifying module 407 corrects the preset probability density function accordingly. This reduces the influence of background feature information in the tracking area on the weights, improves the accuracy of capturing the effective features of the tracked object, and improves the accuracy and stability of tracking.
Further, after the preset probability density function has been corrected, the calculation module 403 may obtain the feature value corresponding to the frame position of the target tracking area, calculate the first probability density of that feature value within the feature value set of the tracked object according to the corrected probability density function and the first center coordinate, obtain the feature value corresponding to the first center coordinate, calculate the third probability density of that feature value within the feature value set according to the corrected probability density function and the first center coordinate, and obtain the first similarity of the first and third probability densities. The second obtaining module 404 may calculate the second probability density, within the feature value set of the tracked object, of the feature value corresponding to the frame position according to the corrected probability density function and the second center coordinate, calculate the fourth probability density, within the feature value set, of the feature value corresponding to the second center coordinate according to the corrected probability density function and the second center coordinate, and obtain the second similarity of the second and fourth probability densities. The tracking module 405 may then determine a motion vector according to the first and second similarities and track the tracked object according to it. With the weights set in reverse and the probability density function corrected by the setting module 406 and the modifying module 407, the calculation module 403, the second obtaining module 404, and the tracking module 405 exploit the weight difference between the frame position and the center position of the target tracking area to capture all effective features of the tracked object, reduce the influence of background feature information in the tracking area, improve tracking accuracy and stability, and prevent loss of tracking.
When tracking the tracked object, the tracking module 405 may obtain a ratio of the first similarity to the second similarity, and determine a motion vector according to the ratio, thereby implementing tracking. If the ratio is larger than a preset value, determining that the motion vector is a far vector, and tracking the tracked object according to the far vector; if the ratio is equal to a preset value, determining that the motion vector is a static vector, and tracking the tracked object according to the static vector; otherwise, determining the motion vector as a close vector, and tracking the tracked object according to the close vector. By determining the ratio of the first similarity to the second similarity, the motion vector for tracking the tracked object is accurately determined according to the ratio, and the tracking stability is improved.
For specific limitations of the target tracking system based on big data analysis, reference may be made to the above limitations of the target tracking method based on big data analysis, and details are not repeated here. The modules in the target tracking system based on big data analysis can be wholly or partially realized by software, hardware and a combination thereof. The modules can be embedded in a hardware form or independent from a processor in the computer device, and can also be stored in a memory in the computer device in a software form, so that the processor can call and execute operations corresponding to the modules.
In one embodiment, a computer device is provided, which may be a terminal whose internal structure may be as shown in fig. 6. The computer device includes a processor, a memory, and a network interface connected by a system bus, as well as a camera and an input device. The processor provides computing and control capabilities. The memory includes a nonvolatile storage medium and an internal memory: the nonvolatile storage medium stores an operating system and a computer program, and the internal memory provides an environment for running them. The network interface connects and communicates with external terminals through a network, images of the target are acquired through the camera, and input instructions and data are received through the input device. The computer program, when executed by the processor, implements a target tracking method based on big data analysis.
In one embodiment, a computer device is provided, comprising a memory and a processor, the memory having a computer program stored therein, the processor implementing the following steps when executing the computer program:
acquiring a first image of a target tracking area, and determining a first center coordinate of the target tracking area according to the first image;
graying the first image to obtain a gray image of the target tracking area, and determining a characteristic value set of a tracked object in the target tracking area according to gray distribution of the gray image;
acquiring a characteristic value corresponding to any position in the target tracking area, and calculating a first probability density of the characteristic value corresponding to the any position in the characteristic value set according to a preset probability density function and the first center coordinate;
after a preset time interval, acquiring a second image of the target tracking area, determining a second center coordinate of the target tracking area according to the second image, and calculating a second probability density of a feature value corresponding to the arbitrary position in the feature value set according to the preset probability density function and the second center coordinate;
and determining a movement vector according to the first probability density and the second probability density, and tracking the tracking object according to the movement vector.
In one embodiment, the processor when executing the computer program further performs the following steps, including determining a target tracking area and a tracking object within the target tracking area according to preset tracking object features and a preset feature extraction algorithm; or determining a target tracking area and a tracking object in the target tracking area according to the received selected instruction. The target tracking area and the tracking object in the area are automatically determined according to the preset tracking object characteristics and the characteristic extraction algorithm, so that the target tracking area and the tracking object can be determined without manual operation, and the labor cost is reduced; the target tracking area and the tracking object in the area are directly determined through the received selected instruction, and the efficiency of determining the target tracking area and the tracking object is improved.
In an embodiment, the processor further implements the following steps when executing the computer program, including setting a weight of each position in the target tracking area, where a size of the weight of each position in the target tracking area is proportional to a distance from each position to a center position of the target tracking area, and modifying the preset probability density function according to the weight corresponding to each position in the target tracking area. The weight of each position in the target tracking area is reversely set, namely the weight of different positions is in direct proportion to the distance from each position to the center of the target tracking area, so that the preset probability density function is corrected, and the accuracy and the stability of tracking the tracked object are improved.
In one embodiment, the processor when executing the computer program further performs the following steps, including obtaining a feature value corresponding to a border position of the target tracking area, calculating a first probability density of the feature value corresponding to the border position within the feature value set according to the modified probability density function and the first center coordinate, obtaining a feature value corresponding to a first center coordinate of the target tracking area, calculating a third probability density of the feature value corresponding to the first center coordinate within the feature value set according to the modified probability density function and the first center coordinate, and obtaining a first similarity of the first probability density and the third probability density; after a preset time interval, acquiring a second image of the target tracking area, determining a second center coordinate of the target tracking area according to the second image, calculating a second probability density of a feature value corresponding to the frame position in the feature value set according to the modified probability density function and the second center coordinate, calculating a fourth probability density of the feature value corresponding to the second center coordinate in the feature value set according to the modified probability density function and the second center coordinate, and acquiring a second similarity of the second probability density and the fourth probability density; and determining a motion vector according to the first similarity and the second similarity, and tracking the tracked object according to the motion vector. After the probability density function is corrected by reversely setting the weight, the weight difference between the frame position and the center position in the target tracking area is utilized, so that all effective characteristics of the tracked object are fully captured, the influence generated by background characteristic information in the tracking area is reduced, the tracking accuracy and stability are improved, and the tracking loss is prevented.
In one embodiment, the processor, when executing the computer program, further performs the following steps: obtaining the ratio of the first similarity to the second similarity and determining the motion vector according to the ratio; if the ratio is greater than a preset value, determining that the motion vector is a far vector and tracking the tracked object according to the far vector; if the ratio is equal to the preset value, determining that the motion vector is a static vector and tracking the tracked object according to the static vector; and if the ratio is less than the preset value, determining that the motion vector is a near vector and tracking the tracked object according to the near vector. By determining the ratio of the first similarity to the second similarity, the motion vector used to track the tracked object is determined accurately, which improves tracking stability.
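The greater/equal/less rule above maps directly to a small decision function. The preset value of 1.0 and the equality tolerance are illustrative assumptions; the embodiment only states that the ratio is compared with a preset value.

```python
def classify_motion(first_similarity, second_similarity, preset_value=1.0, eps=1e-6):
    """Map the similarity ratio to a far / static / near motion vector label."""
    # Assumes second_similarity is non-zero; a real implementation would guard this.
    ratio = first_similarity / second_similarity
    if ratio > preset_value + eps:
        return "far"      # ratio greater than the preset value
    if abs(ratio - preset_value) <= eps:
        return "static"   # ratio equal to the preset value (within tolerance)
    return "near"         # ratio less than the preset value
```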
In one embodiment, a computer-readable storage medium is provided, having a computer program stored thereon, which when executed by a processor, performs the steps of:
acquiring a first image of a target tracking area, and determining a first center coordinate of the target tracking area according to the first image;
graying the first image to obtain a gray image of the target tracking area, and determining a characteristic value set of a tracked object in the target tracking area according to gray distribution of the gray image;
acquiring a characteristic value corresponding to any position in the target tracking area, and calculating a first probability density of the characteristic value corresponding to that position in the characteristic value set according to a preset probability density function and the first center coordinate;
after a preset time interval, acquiring a second image of the target tracking area, determining a second center coordinate of the target tracking area according to the second image, and calculating a second probability density of the characteristic value corresponding to that position in the characteristic value set according to the preset probability density function and the second center coordinate;
and determining a motion vector according to the first probability density and the second probability density, and tracking the tracked object according to the motion vector.
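Read together, these steps amount to comparing, before and after the preset time interval, the kernel-weighted density of the same position's feature value relative to the current center coordinate. The sketch below is one illustrative reading in Python/numpy; the Gaussian kernel, bin count, and bandwidth are assumptions, not part of the claims.

```python
import numpy as np

def gray_feature_bins(gray_image, num_bins=16):
    """Quantize gray levels into a feature value set (bin indices)."""
    return (gray_image.astype(np.float64) / 256.0 * num_bins).astype(int)

def kernel_density_at(gray_image, center, position, num_bins=16, bandwidth=30.0):
    """Probability density, within the feature value set, of the feature value
    at `position`, computed with a Gaussian kernel centred on `center`
    (the preset probability density function is not specified in the text)."""
    bins = gray_feature_bins(gray_image, num_bins)
    ys, xs = np.mgrid[0:gray_image.shape[0], 0:gray_image.shape[1]]
    # Kernel weight of every pixel relative to the area center coordinate.
    k = np.exp(-((ys - center[0]) ** 2 + (xs - center[1]) ** 2) / (2 * bandwidth ** 2))
    target_bin = bins[position]
    return float(k[bins == target_bin].sum() / k.sum())

def track_step(first_image, second_image, c1, c2, position):
    """One tracking step: densities of the same position's feature value
    before and after the preset time interval, plus the center displacement."""
    p1 = kernel_density_at(first_image, c1, position)
    p2 = kernel_density_at(second_image, c2, position)
    return p1, p2, (c2[0] - c1[0], c2[1] - c1[1])
```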
In one embodiment, the computer program, when executed by the processor, further performs the following steps: determining a target tracking area and a tracking object within the target tracking area according to preset tracking object features and a preset feature extraction algorithm; or determining a target tracking area and a tracking object in the target tracking area according to a received selection instruction. When the target tracking area and the tracking object in the area are determined automatically from the preset tracking object features and the feature extraction algorithm, no manual operation is needed, which reduces labor cost; when they are determined directly from the received selection instruction, the efficiency of determining the target tracking area and the tracking object is improved.
In one embodiment, the computer program, when executed by the processor, further performs the following steps: setting a weight for each position in the target tracking area, wherein the weight of each position is directly proportional to the distance from that position to the center position of the target tracking area; and modifying the preset probability density function according to the weight corresponding to each position in the target tracking area. The weights are thus set in reverse of the usual practice, i.e., the farther a position is from the center of the target tracking area, the larger its weight; correcting the preset probability density function with these weights improves the accuracy and stability of tracking the tracked object.
In one embodiment, the computer program, when executed by the processor, further performs the following steps: acquiring the feature value corresponding to a border position of the target tracking area; calculating a first probability density of that feature value within the feature value set according to the modified probability density function and the first center coordinate; acquiring the feature value corresponding to the first center coordinate of the target tracking area; calculating a third probability density of that feature value within the feature value set according to the modified probability density function and the first center coordinate; and obtaining a first similarity between the first probability density and the third probability density. After a preset time interval, a second image of the target tracking area is acquired, a second center coordinate of the target tracking area is determined according to the second image, a second probability density of the feature value corresponding to the border position in the feature value set is calculated according to the modified probability density function and the second center coordinate, a fourth probability density of the feature value corresponding to the second center coordinate in the feature value set is calculated according to the modified probability density function and the second center coordinate, and a second similarity between the second probability density and the fourth probability density is obtained. A motion vector is then determined according to the first similarity and the second similarity, and the tracked object is tracked according to the motion vector. Because the probability density function has been corrected with the reverse weighting, the weight difference between the border positions and the center position of the target tracking area is exploited: all effective features of the tracked object are captured, the influence of background feature information in the tracking area is reduced, tracking accuracy and stability are improved, and loss of the target is prevented.
In one embodiment, the computer program, when executed by the processor, further performs the following steps: obtaining the ratio of the first similarity to the second similarity and determining the motion vector according to the ratio; if the ratio is greater than a preset value, determining that the motion vector is a far vector and tracking the tracked object according to the far vector; if the ratio is equal to the preset value, determining that the motion vector is a static vector and tracking the tracked object according to the static vector; otherwise, determining that the motion vector is a near vector and tracking the tracked object according to the near vector. By determining the ratio of the first similarity to the second similarity, the motion vector used to track the tracked object is determined accurately, which improves tracking stability.
It will be understood by those skilled in the art that all or part of the processes of the methods of the embodiments described above can be implemented by a computer program instructing the relevant hardware; the computer program can be stored in a non-volatile computer-readable storage medium and, when executed, can include the processes of the embodiments of the methods described above. Any reference to memory, storage, a database, or other media used in the embodiments provided herein may include non-volatile and/or volatile memory. Non-volatile memory can include read-only memory (ROM), programmable ROM (PROM), electrically programmable ROM (EPROM), electrically erasable programmable ROM (EEPROM), or flash memory. Volatile memory can include random access memory (RAM) or external cache memory. By way of illustration and not limitation, RAM is available in a variety of forms, such as static RAM (SRAM), dynamic RAM (DRAM), synchronous DRAM (SDRAM), double data rate SDRAM (DDR SDRAM), enhanced SDRAM (ESDRAM), synchlink DRAM (SLDRAM), Rambus direct RAM (RDRAM), direct Rambus dynamic RAM (DRDRAM), and Rambus dynamic RAM (RDRAM).
It should be noted that the terms "first", "second", "third", and "fourth" in the embodiments of the present invention are merely used to distinguish similar objects and do not denote a particular order. It should be understood that objects so distinguished may be interchanged where appropriate, so that the embodiments of the invention described herein can be carried out in sequences other than those illustrated or described herein.
The technical features of the above embodiments can be combined arbitrarily. For brevity, not all possible combinations of these technical features are described; nevertheless, any such combination should be considered within the scope of this specification as long as it contains no contradiction.
The above-mentioned embodiments merely express several implementations of the present invention, and their description is specific and detailed, but they are not to be construed as limiting the scope of the invention. It should be noted that a person skilled in the art can make various variations and modifications without departing from the concept of the invention, all of which fall within the protection scope of the invention. Therefore, the protection scope of this patent shall be subject to the appended claims.
Claims (10)
1. A target tracking method based on big data analysis is characterized by comprising the following steps:
acquiring a first image of a target tracking area, and determining a first center coordinate of the target tracking area according to the first image;
graying the first image to obtain a gray image of the target tracking area, and determining a characteristic value set of a tracked object in the target tracking area according to gray distribution of the gray image;
acquiring a characteristic value corresponding to any position in the target tracking area, and calculating a first probability density of the characteristic value corresponding to that position in the characteristic value set according to a preset probability density function and the first center coordinate;
after a preset time interval, acquiring a second image of the target tracking area, determining a second center coordinate of the target tracking area according to the second image, and calculating a second probability density of the characteristic value corresponding to that position in the characteristic value set according to the preset probability density function and the second center coordinate;
and determining a motion vector according to the first probability density and the second probability density, and tracking the tracked object according to the motion vector.
2. The big data analysis-based target tracking method according to claim 1, further comprising, before the step of acquiring the first image of the target tracking area:
determining a target tracking area and a tracking object in the target tracking area according to preset tracking object characteristics and a preset characteristic extraction algorithm; or
determining a target tracking area and a tracking object in the target tracking area according to a received selection instruction.
3. The big data analysis-based target tracking method according to any one of claims 1 to 2, further comprising, before the step of obtaining the feature value corresponding to any position in the target tracking area:
setting a weight value of each position in the target tracking area, wherein the weight value of each position in the target tracking area is in direct proportion to the distance from each position to the center position of the target tracking area;
and correcting the preset probability density function according to the weight corresponding to each position in the target tracking area.
4. The big data analysis-based target tracking method according to claim 3, wherein the step of acquiring a characteristic value corresponding to any position in the target tracking area, and calculating a first probability density of the characteristic value corresponding to that position in the characteristic value set according to a preset probability density function and the first center coordinate comprises:
acquiring a characteristic value corresponding to a border position of the target tracking area, calculating a first probability density of the characteristic value corresponding to the border position in the characteristic value set according to the corrected probability density function and the first center coordinate, acquiring a characteristic value corresponding to the first center coordinate of the target tracking area, calculating a third probability density of the characteristic value corresponding to the first center coordinate in the characteristic value set according to the corrected probability density function and the first center coordinate, and obtaining a first similarity between the first probability density and the third probability density;
the step of calculating a second probability density of the characteristic value corresponding to that position in the characteristic value set according to the preset probability density function and the second center coordinate comprises:
calculating a second probability density of the characteristic value corresponding to the border position in the characteristic value set according to the corrected probability density function and the second center coordinate, calculating a fourth probability density of the characteristic value corresponding to the second center coordinate in the characteristic value set according to the corrected probability density function and the second center coordinate, and obtaining a second similarity between the second probability density and the fourth probability density;
and the step of determining a motion vector according to the first probability density and the second probability density and tracking the tracked object according to the motion vector comprises:
and determining a motion vector according to the first similarity and the second similarity, and tracking the tracked object according to the motion vector.
5. The big data analysis-based target tracking method according to claim 4, wherein the determining a motion vector according to the first similarity and the second similarity, and the tracking the tracked object according to the motion vector comprises:
obtaining a ratio of the first similarity to the second similarity, and determining a motion vector according to the ratio;
if the ratio is larger than a preset value, determining that the motion vector is a far vector, and tracking the tracked object according to the far vector;
if the ratio is equal to the preset value, determining that the motion vector is a static vector, and tracking the tracked object according to the static vector;
and if the ratio is smaller than the preset value, determining that the motion vector is a close vector, and tracking the tracked object according to the close vector.
6. A big data analysis-based target tracking system, the system comprising:
the first acquisition module is used for acquiring a first image of a target tracking area and determining a first center coordinate of the target tracking area according to the first image;
the determining module is used for carrying out graying processing on the first image to obtain a gray image of the target tracking area and determining a characteristic value set of a tracked object in the target tracking area according to gray distribution of the gray image;
the calculation module is used for acquiring a characteristic value corresponding to any position in the target tracking area, and calculating a first probability density of the characteristic value corresponding to that position in the characteristic value set according to a preset probability density function and the first center coordinate;
a second obtaining module, configured to obtain a second image of the target tracking area after a preset time interval, determine a second center coordinate of the target tracking area according to the second image, and calculate a second probability density of the characteristic value corresponding to that position in the characteristic value set according to the preset probability density function and the second center coordinate;
and the tracking module is used for determining a motion vector according to the first probability density and the second probability density and tracking the tracked object according to the motion vector.
7. The big-data-analysis-based target tracking system of claim 6, further comprising:
the setting module is used for setting the weight of each position in the target tracking area before the calculation module obtains the characteristic value corresponding to any position in the target tracking area, wherein the weight of each position in the target tracking area is in direct proportion to the distance from each position to the center position of the target tracking area;
and the correcting module is used for correcting the preset probability density function according to the weight corresponding to each position in the target tracking area.
8. The big data analysis-based target tracking system according to claim 7, wherein the calculation module is further configured to acquire a characteristic value corresponding to a border position of the target tracking area, calculate a first probability density of the characteristic value corresponding to the border position within the characteristic value set according to the modified probability density function and the first center coordinate, acquire a characteristic value corresponding to the first center coordinate of the target tracking area, calculate a third probability density of the characteristic value corresponding to the first center coordinate within the characteristic value set according to the modified probability density function and the first center coordinate, and obtain a first similarity between the first probability density and the third probability density;
the second obtaining module is further configured to obtain a second image of the target tracking area after a preset time interval, determine a second center coordinate of the target tracking area according to the second image, calculate a second probability density of the characteristic value corresponding to the border position in the characteristic value set according to the modified probability density function and the second center coordinate, calculate a fourth probability density of the characteristic value corresponding to the second center coordinate in the characteristic value set according to the modified probability density function and the second center coordinate, and obtain a second similarity between the second probability density and the fourth probability density;
the tracking module is further configured to determine a motion vector according to the first similarity and the second similarity, and track the tracked object according to the motion vector.
9. A computer storage medium having a computer program stored thereon, wherein the computer program, when executed by a processor, implements the big data analysis based object tracking method of any of claims 1 to 5.
10. A computer device comprising a memory, a processor and a computer program stored on the memory and executable on the processor, wherein the processor implements the big data analysis based object tracking method of any one of claims 1 to 5 when executing the program.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201810427346.8A CN108876806A (en) | 2018-05-07 | 2018-05-07 | Method for tracking target and system, storage medium and equipment based on big data analysis |
Publications (1)
Publication Number | Publication Date |
---|---|
CN108876806A true CN108876806A (en) | 2018-11-23 |
Family
ID=64327467
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201810427346.8A Pending CN108876806A (en) | 2018-05-07 | 2018-05-07 | Method for tracking target and system, storage medium and equipment based on big data analysis |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN108876806A (en) |
Patent Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20070165906A1 (en) * | 2000-12-19 | 2007-07-19 | Lockheed Martin Corporation | Fast fourier transform correlation tracking algorithm with background correction |
CN101141633A (en) * | 2007-08-28 | 2008-03-12 | 湖南大学 | Moving object detecting and tracing method in complex scene |
CN101324956A (en) * | 2008-07-10 | 2008-12-17 | 上海交通大学 | Method for tracking anti-shield movement object based on average value wander |
CN101493944A (en) * | 2009-03-06 | 2009-07-29 | 北京中星微电子有限公司 | Moving target detecting and tracking method and system |
Non-Patent Citations (1)
Title |
---|
Xu Xiaoxiang (徐骁翔): "Research on moving target detection and tracking based on mean shift" (基于meanshift的运动目标检测与跟踪研究), China Excellent Master's Theses Electronic Journals Network (《中国优秀硕士学位论文电子期刊网》) * |
Cited By (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN110177256A (en) * | 2019-06-17 | 2019-08-27 | 北京影谱科技股份有限公司 | A kind of tracking video data acquisition methods and device |
CN111314884A (en) * | 2020-02-25 | 2020-06-19 | 青海省地质测绘地理信息院 | Method and system for transmitting remote sensing image information of unmanned aerial vehicle at high speed |
CN111314884B (en) * | 2020-02-25 | 2022-07-01 | 青海省地质测绘地理信息院 | Method and system for transmitting remote sensing image information of unmanned aerial vehicle at high speed |
CN111464945A (en) * | 2020-04-07 | 2020-07-28 | 广州起妙科技有限公司 | Positioning method and system of terminal equipment |
CN113495270A (en) * | 2020-04-07 | 2021-10-12 | 富士通株式会社 | Monitoring device and method based on microwave radar |
CN117250996A (en) * | 2023-11-20 | 2023-12-19 | 中国人民解放军海军工程大学 | Method for searching movable target by unmanned cluster |
CN117250996B (en) * | 2023-11-20 | 2024-02-09 | 中国人民解放军海军工程大学 | Method for searching movable target by unmanned cluster |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
 | PB01 | Publication | |
 | SE01 | Entry into force of request for substantive examination | |
 | RJ01 | Rejection of invention patent application after publication | Application publication date: 20181123 |