CN104599286B - Optical-flow-based feature tracking method and device - Google Patents

Optical-flow-based feature tracking method and device

Info

Publication number: CN104599286B (grant of application CN104599286A)
Application number: CN201310529938.8A
Authority: CN (China)
Prior art keywords: feature point, tracking, value
Legal status: Active (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed)
Other languages: Chinese (zh)
Inventors: 刘阳, 张乐, 陈敏杰, 林福辉
Current and original assignee: Spreadtrum Communications Tianjin Co Ltd (the listed assignees may be inaccurate; Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list)
Application filed by Spreadtrum Communications Tianjin Co Ltd


Classifications

    • G06T7/207 — Image analysis; analysis of motion for motion estimation over a hierarchy of resolutions
    • G06T7/269 — Image analysis; analysis of motion using gradient-based methods
    • G06T2207/20032 — Indexing scheme for image analysis or enhancement; special algorithmic details; median filtering


Abstract

A feature tracking method and device based on optical flow. The method includes: obtaining the feature points contained in a tracking window; tracking the feature points based on a sparse optical flow algorithm; and, when a tracked feature point lies outside a preset region, repositioning that feature point. The preset region is a region centered on the median feature point, the median feature point being the feature point whose summed distance to all other feature points is smallest. By repositioning trace points that may be erroneous during feature point tracking, the method improves the accuracy of the feature points and thus the accuracy of the tracking result.

Description

Optical-flow-based feature tracking method and device
Technical field
The present invention relates to the technical field of image processing, and more particularly to a feature tracking method and device based on optical flow.
Background technique
With the rapid development of moving-target detection technology, a variety of detection methods have emerged, for example methods built on the color features, motion information, or motion model of the moving target. Among these, feature detection and tracking of the moving target is an important foundation and key technology: for instance, features can be detected and tracked in an image sequence capturing the hands or face of a person in motion, enabling recognition of the person's gestures, face, and so on.
Detection methods based on the color features of a moving target include mean shift and continuously adaptive mean shift; in some simple scenes these methods can track, for example, a person's gestures reasonably well. Detection methods based on the motion information of a moving target include optical flow, Kalman filtering (Kalman Filter), and particle filtering (Particle Filter). Among these, the optical flow method uses the temporal and spatial variation of pixel intensity in an image sequence containing a moving target to infer the target's motion field (Motion Field) and ultimately track the target. According to the number of pixels involved in the computation, optical flow divides into dense optical flow and sparse optical flow. Detection can also be based on a motion model: such methods first build a 2D or 3D model of the moving target, for example a 2D or 3D model of the human hand, and then iteratively update and optimize the model parameters during tracking so that the model continuously adapts to the changes of the moving target.
Among the above detection methods based on motion information, the optical flow method usually computes optical flow for several feature points in the image sequence, realizing tracking and recognition of the moving target via tracking of those feature points. However, optical flow tracking demands a relatively good match between frames of the image sequence; in some complex scenes the match may be poor, causing the tracking of some feature points to go wrong, which in turn may make the tracking and recognition of the moving target fail.
Related art can be found in US patent application publication US2013259317A1.
Summary of the invention
The problem solved by the invention is that of inaccurate feature point tracking.
To solve the above problem, the technical solution of the present invention provides a feature tracking method based on optical flow, the method including:
obtaining the feature points contained in a tracking window;
tracking the feature points based on a sparse optical flow algorithm;
when a tracked feature point lies outside a preset region, repositioning the tracked feature point, where the preset region is a region centered on the median feature point, and the median feature point is the tracked feature point whose summed distance to all other tracked feature points is smallest.
Optionally, obtaining the feature points contained in the tracking window of the image includes:
obtaining the autocorrelation matrix of every pixel in the tracking window of the image by the following formula:

M(x, y) = Σ_{i=−K}^{K} Σ_{j=−K}^{K} w_{i,j} · [ I_x²    I_x·I_y
                                                I_x·I_y    I_y² ]

where M(x, y) is the autocorrelation matrix of the pixel with coordinates (x, y); i and j are the index values of a pixel in the tracking window in the X and Y directions respectively; w_{i,j} is the weight of the pixel with index i in the X direction and index j in the Y direction; K is the half-width of the tracking window; and I_x and I_y are the partial derivatives, in the X and Y directions respectively, of the pixel with index i in the X direction and index j in the Y direction;
based on the autocorrelation matrix of the pixel, obtaining the maximum and minimum eigenvalues of the pixel's autocorrelation matrix;
when λ(min) > A × λ(max), determining that the pixel is a feature point contained in the tracking window, where λ(max) and λ(min) are the maximum and minimum eigenvalues of the pixel's autocorrelation matrix and A is the feature threshold.
Optionally, the value of the feature threshold is 0.001 to 0.01.
Optionally, the method further includes: after obtaining the feature points contained in the tracking window and before tracking them based on the sparse optical flow algorithm, performing illumination compensation on the feature points.
Optionally, performing illumination compensation on the feature points includes:
performing illumination compensation on the feature points contained in the tracking window based on the formula Jn = λ × J + δ, where λ is the gain coefficient of the feature point's brightness, δ is the bias coefficient of the feature point's brightness, J is the brightness value of the feature point before compensation, and Jn is the brightness value of the feature point after compensation.
Optionally, repositioning a tracked feature point includes:
repositioning the tracked feature point by the formula N = R × M + (1 − R) × Nb, where N is the coordinate value of the tracked feature point after repositioning, R is an update coefficient whose value lies between 0 and 1, M is the coordinate value of the median feature point, and Nb is the coordinate value of the tracked feature point before repositioning.
Optionally, the preset region is a circular region centered on the median feature point whose radius is half the side length of the tracking window.
Optionally, the sparse optical flow algorithm is an image pyramid optical flow algorithm.
Optionally, the method further includes: after the tracked feature points are repositioned, recognizing the user's gesture based on the tracking result of the feature points in the tracking window.
The technical solution of the present invention also provides a feature tracking device based on optical flow, the device including:
an acquiring unit, adapted to obtain the feature points contained in a tracking window;
a tracking unit, adapted to track the feature points based on a sparse optical flow algorithm;
a repositioning unit, adapted to reposition a tracked feature point when it lies outside a preset region, where the preset region is a region centered on the median feature point, and the median feature point is the tracked feature point whose summed distance to all other tracked feature points is smallest.
Optionally, the device further includes: a compensating unit, adapted to perform illumination compensation on the feature points after the feature points of the image are obtained and before they are tracked based on the sparse optical flow algorithm.
Optionally, the device further includes: a recognition unit, adapted to recognize the user's gesture, after the tracked feature points are repositioned, based on the tracking result of the feature points in the tracking window.
Compared with the prior art, the technical solution of the present invention has the following advantages:
After the feature points contained in a tracking window are obtained, the feature points are tracked based on a sparse optical flow algorithm; when a tracked feature point lies outside the preset region, it is repositioned, the preset region being a region centered on the median feature point, i.e. the feature point whose summed distance to all other feature points is smallest. During feature tracking, the technical solution of the present invention repositions tracked points that may not meet expectations, which improves the accuracy of the feature points and thereby the accuracy of the tracking result.
Further, before the feature points are tracked by the optical flow algorithm, illumination compensation is applied to the pixel regions where the feature points lie; this effectively adjusts images captured under different illumination conditions and improves the stability and accuracy of feature point tracking under different illumination conditions.
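As a minimal sketch of the compensation step: the linear model Jn = λ × J + δ can be applied to the pixel neighborhood of each feature point before tracking. The gain and bias values below are illustrative assumptions; the patent does not fix how they are estimated.

```python
import numpy as np

def compensate_illumination(patch, gain, bias):
    """Apply the linear brightness model Jn = gain * J + bias to a patch.

    `gain` and `bias` correspond to the lambda and delta coefficients of
    the compensation formula; their estimation is out of scope here, so
    they are taken as inputs (an assumption of this sketch).
    """
    compensated = gain * patch.astype(np.float64) + bias
    # Clamp back to the valid 8-bit intensity range.
    return np.clip(compensated, 0, 255)

patch = np.array([[100.0, 120.0], [140.0, 160.0]])
print(compensate_illumination(patch, gain=1.1, bias=-5.0))
```

In practice the same gain and bias would be applied consistently to the windows used by the optical flow residual, so that the brightness-constancy assumption holds across frames.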
Brief description of the drawings
Fig. 1 is a schematic flowchart of the optical-flow-based feature tracking method provided by the technical solution of the present invention;
Fig. 2 is a schematic flowchart of the optical-flow-based feature tracking method provided by Embodiment 1 of the present invention;
Fig. 3 is a schematic flowchart of the optical-flow-based feature tracking method provided by Embodiment 2 of the present invention.
Specific embodiment
To solve the above problems, the technical solution of the present invention provides a feature tracking method based on optical flow. In this method, based on the tracking result of the feature points, a tracked feature point is repositioned when its position may be abnormal, for example when it lies outside a preset region.
Fig. 1 is a schematic flowchart of the optical-flow-based feature tracking method provided by the technical solution of the present invention. As shown in Fig. 1, step S101 is performed first: obtain the feature points contained in a tracking window.
When tracking a moving target, the tracking window must be determined first. Its size can usually be set according to the size of the captured image, and the window itself can be obtained by a variety of methods known to those skilled in the art, such as motion detection, background removal, or skin color detection based on a trained model. The tracking window contains the moving target; for example, it may contain a hand image or a face image. The tracking window is usually a region of regular shape, such as a square or rectangular region.
The optical flow method can be understood as computing optical flow over several pixels centered on each feature point in the image sequence, i.e. computing optical flow over the pel where the feature point lies, and then tracking the moving target based on the computed result. In this specification, a pel is defined as the region containing several pixels centered on a feature point. When tracking feature points based on optical flow, the feature points in the tracking window must be obtained first; they can be obtained by a variety of existing methods, such as the Shi-Tomasi corner algorithm or the Harris algorithm, which are not specifically limited here.
Step S102 is performed: track the feature points based on a sparse optical flow algorithm.
After the feature points contained in the tracking window are obtained, they can be tracked by an optical flow algorithm; for example, the sparse optical flow algorithm may be a sparse optical flow algorithm based on an image pyramid.
Feature points can be extracted on the previous frame of the image sequence, and the image-pyramid-based sparse optical flow algorithm can then track those feature points into the next frame, yielding the positions in the next frame of the feature points of the previous frame.
Step S103 is performed: when a tracked feature point lies outside a preset region, reposition the tracked feature point. The preset region is a region centered on the median feature point, and the median feature point is the feature point whose summed distance to all other feature points is smallest. In this specification, a tracked feature point lying outside the preset region is called a specific feature point.
When feature points are tracked based on the optical flow algorithm, their post-tracking positions are obtained from the algorithm; if a tracked feature point lies outside the preset region, i.e. the tracked feature point is a specific feature point, its position is adjusted by gathering.
During gathering, a median feature point is determined first: for each tracked feature point in the tracking window, its distances to all other tracked feature points are accumulated, and the feature point for which this accumulated sum of distances is smallest is taken as the median feature point. Computed this way, the median feature point lies at the center of all feature points in the tracking window; feature points with large accumulated distances are not considered during the computation of the median feature point.
After the median feature point is obtained, a preset region can be determined. The preset region may be a region centered on the median feature point, with a size of roughly 0.8 to 1.2 times the current tracking window size; usually, the region corresponding to the current tracking window size, centered on the median feature point, is taken as the preset region. For example, the preset region may be a circular region centered on the median feature point, with a radius tied to the side length of the tracking window: taking a square tracking window as an example, the radius of the circular region may be half the side length of the tracking window.
After the preset region is determined, the specific feature points can be gathered (also called aggregated). The gathering may use a fixed gathering speed or step length, or a variable-step algorithm; the specific method is not limited here.
In the above method, trace points that may be erroneous are repositioned during feature point tracking, which can improve the accuracy of the feature points and effectively improve the accuracy of the tracking result.
To make the above objects, features, and advantages of the invention more obvious and understandable, the technical solution of the present invention is described further below with reference to the accompanying drawings and embodiments.
Embodiment 1
In this embodiment, after the feature points contained in a tracking window are obtained, they are tracked by the image-pyramid-based sparse optical flow algorithm, and a tracked feature point is repositioned when it lies outside the preset region.
Fig. 2 is a schematic flowchart of the optical-flow-based feature tracking method provided by this embodiment. As shown in Fig. 2, step S201 is performed first: determine the tracking window of the image.
The tracking window can be determined according to the size of the captured image and obtained by a variety of methods known to those skilled in the art, such as motion detection, background removal, or skin color detection based on a trained model. In this embodiment, gesture tracking of a user is taken as the example, so the tracking window contains the user's hand image.
Step S202 is performed: obtain the feature points in the tracking window based on the Shi-Tomasi corner algorithm.
In this embodiment, the Shi-Tomasi corner algorithm is taken as the example method for obtaining feature points.
In the Shi-Tomasi corner algorithm, the autocorrelation matrix of every pixel in the tracking window of the image is first obtained by formula (1):

M(x, y) = Σ_{i=−K}^{K} Σ_{j=−K}^{K} w_{i,j} · [ I_x²    I_x·I_y
                                                I_x·I_y    I_y² ]   (1)

where M(x, y) is the autocorrelation matrix of the pixel with coordinates (x, y); i and j are the index values of a pixel in the tracking window in the X and Y directions respectively; w_{i,j} is the weight of the pixel with index i in the X direction and index j in the Y direction; K is the half-width of the tracking window; and I_x and I_y are the partial derivatives, in the X and Y directions respectively, of the pixel with index i in the X direction and index j in the Y direction.
Based on the autocorrelation matrices of all pixels computed by formula (1), the maximum eigenvalue λ(max) and minimum eigenvalue λ(min) of each pixel's autocorrelation matrix are obtained. Methods for obtaining the maximum and minimum eigenvalues of an autocorrelation matrix are well known to those skilled in the art and are not described here.
Formula (2) determines whether a pixel is a feature point of the image:

λ(min) > A × λ(max)   (2)

where A is the feature threshold, and its value lies between 0.001 and 0.01.
Usually, when a pixel in the tracking window satisfies formula (2), the pixel can be determined to be a feature point of the image.
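A minimal numpy sketch of this eigenvalue test (the helper names, the uniform window weights, and the threshold A = 0.005 are illustrative assumptions; A merely falls inside the stated 0.001–0.01 range):

```python
import numpy as np

def structure_tensor(Ix, Iy, window):
    """Autocorrelation matrix M of formula (1) for one window.

    Ix, Iy are the X/Y partial derivatives inside the window; `window`
    is the weight mask w_{i,j} (uniform here for simplicity).
    """
    a = np.sum(window * Ix * Ix)
    b = np.sum(window * Ix * Iy)
    c = np.sum(window * Iy * Iy)
    return np.array([[a, b], [b, c]])

def is_feature(M, A=0.005):
    """Shi-Tomasi style test of formula (2): lambda_min > A * lambda_max."""
    lam_min, lam_max = np.linalg.eigvalsh(M)  # eigenvalues in ascending order
    return lam_min > A * lam_max

# A corner-like patch: strong gradients in both directions.
Ix = np.array([[1.0, 0.0], [1.0, 0.0]])
Iy = np.array([[1.0, 1.0], [0.0, 0.0]])
w = np.ones((2, 2))
print(is_feature(structure_tensor(Ix, Iy, w)))   # True: both eigenvalues large

# An edge-like patch: gradient only along X, so lambda_min is ~0.
print(is_feature(structure_tensor(np.ones((2, 2)), np.zeros((2, 2)), w)))  # False
```

The edge case illustrates why the ratio test is used: an edge has one large and one near-zero eigenvalue and is rejected, while a corner passes.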
Step S203 is performed: track the feature points in the tracking window based on the sparse optical flow algorithm.
In this embodiment, the image-pyramid-based sparse optical flow algorithm tracks the feature points in the tracking window.
Sparse optical flow can be understood as the registration, between adjacent frames, of the images of several pixels centered on each feature point; that is, optical flow is computed on the pel where the feature point lies, and the user's gesture is then tracked based on the computed result.
The image-pyramid-based sparse optical flow algorithm normally iterates a gradient-based optical flow computation, using the pyramid to realize a coarse-to-fine motion estimation. In the algorithm, the original image sits at the bottom layer of the pyramid, and each higher layer of the image pyramid is a down-sampled form of the layer below. The actual computation proceeds from the top layer of the pyramid toward the bottom: once the optical flow of one layer has been computed, the optical flow of the adjacent layer below can be computed from that layer's result, and this iteration continues until the optical flow of the original image at the bottom layer is computed.
When the image-pyramid-based sparse optical flow algorithm tracks the feature points in the tracking window, the goal of feature point tracking for two given adjacent frames is to find, for a pel I on one frame, the corresponding pel J with similar image intensity on the adjacent frame.
To compute the optical flow of a pel, the residual function ξ(d) shown in formula (3) is used. In this embodiment, pel I is taken as the tracked pel:

ξ(d) = ξ(d_x, d_y) = Σ_{x=u_x−w_x}^{u_x+w_x} Σ_{y=u_y−w_y}^{u_y+w_y} ( I(x, y) − J(x + d_x, y + d_y) )²   (3)

where I and J are corresponding pels between adjacent frames; d is the optical flow to be computed, with d_x and d_y its components in the x and y directions; u_x and u_y are the positions of the feature point in pel I in the x and y directions; w_x and w_y are the half window widths of pel I in the x and y directions; I(x, y) is the image intensity of pel I at (x, y); and J(x + d_x, y + d_y) is the image intensity of pel J at (x + d_x, y + d_y).
After the tracking residual ξ(d) of the tracked pel I is obtained by formula (3), the optical flow can usually be computed iteratively by gradient descent.
Ideally, the first derivative of the residual function ξ(d) with respect to the optical flow d to be computed should be zero, as shown in formula (4):

∂ξ(d)/∂d |_{d = d_opt} = 0   (4)

In the concrete computation, ∂ξ(d)/∂d can be computed by formula (5):

∂ξ(d)/∂d = −2 · Σ_x Σ_y ( I(x, y) − J(x + d_x, y + d_y) ) · [ ∂J/∂x   ∂J/∂y ]   (5)

Expanding J(x + d_x, y + d_y) with a first-order Taylor series gives formula (6):

∂ξ(d)/∂d ≈ −2 · Σ_x Σ_y ( I(x, y) − J(x, y) − [ ∂J/∂x   ∂J/∂y ] · d ) · [ ∂J/∂x   ∂J/∂y ]   (6)

The matrix [ ∂J/∂x   ∂J/∂y ] represents the image gradient vector and can be written as shown in formula (7):

∇I = [ I_x   I_y ]^T   (7)

where I_x and I_y are the spatial derivatives of the image in the x and y directions.
Let δI(x, y) = I(x, y) − J(x, y) represent the image time-differential, and let I_x = ∂I(x, y)/∂x and I_y = ∂I(x, y)/∂y represent the spatial derivatives of the image in the x and y directions.
To reduce the amount of computation in the optical flow iteration: after the image is decomposed to a given layer, the motion between the images of adjacent layers becomes sufficiently small, so ∂I/∂x and ∂I/∂y can be substituted for ∂J/∂x and ∂J/∂y; this substitution satisfies the optical flow assumption.
Based on the above analysis, formula (6) can be rewritten as formula (8):

(1/2) · ∂ξ(d)/∂d ≈ Σ_x Σ_y ( ∇I^T · d − δI ) · [ I_x   I_y ]   (8)

From formula (8), formula (9) is obtained:

(1/2) · [ ∂ξ(d)/∂d ]^T ≈ Σ_x Σ_y ( [ I_x²    I_x·I_y
                                      I_x·I_y    I_y² ] · d − [ δI·I_x
                                                                δI·I_y ] )   (9)

Then, letting

G = Σ_x Σ_y [ I_x²    I_x·I_y
              I_x·I_y    I_y² ],   b = Σ_x Σ_y [ δI·I_x
                                                 δI·I_y ],

formula (9) can be rewritten as formula (10):

(1/2) · [ ∂ξ(d)/∂d ]^T ≈ G · d − b   (10)

From formula (10), the ideal optical flow vector d_opt is obtained as shown in formula (11):

d_opt = G⁻¹ b   (11)

In the actual computation, if an accurate solution of the optical flow is desired, iteration is required, i.e. the computation is iterated using formula (12):

η_k = G⁻¹ b_k   (12)

where G is the Hessian matrix (Hessian Matrix), b_k is the gradient-weighted residual vector (Gradient-weighted Residual Vector) at the k-th iteration, and η_k is the residual optical flow at the k-th iteration.
After the residual optical flow η_k at the k-th iteration is obtained, the estimated optical flow at the k-th iteration can be obtained by formula (13):

v_k = v_{k−1} + η_k   (13)

where v_k is the estimated optical flow at the k-th iteration, v_{k−1} is the estimated optical flow after the (k−1)-th iteration, and η_k is the residual optical flow at the k-th iteration.
After a number of iterations, once the convergence condition or the iteration count is met, the optical flow d is obtained as shown in formula (14):

d = v_k = Σ_{i=1}^{k} η_i   (14)

where k is the preset iteration count, or the iteration count at which the convergence condition is met, and v_k is the optical flow value computed when the iteration count reaches k.
Formula (14) yields the optical flow computed by repeated iteration on an image of a single scale. To realize tracking of targets with large motion under complex scenes, the image pyramid is used to realize coarse-to-fine motion estimation: the iterative single-scale computation above is first performed on the image of the coarsest scale, the result computed on the coarse-scale image is carried onto the image of the next finer scale for further iteration, and so on; the final optical flow result d_last is obtained by formula (15):

d_last = Σ_{L=0}^{L_m} 2^L · d^L   (15)

where L is the level of the image pyramid, L ∈ [0, L_m], L_m is the highest layer of the image pyramid, L = 0 denotes the original image, and d^L is the optical flow result computed at layer L.
After the optical flow of the currently tracked pel is obtained by formula (15), the position of the pel after tracking can be obtained; that is, the position of the tracked feature point can be determined.
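The single-scale iteration of formulas (12)-(14) can be sketched in numpy as below. This is a hedged illustration, not the patent's implementation: the pyramid loop of formula (15) is omitted, and the bilinear sampler, window size, and synthetic Gaussian frames are assumptions for the demonstration.

```python
import numpy as np

def _bilinear_window(img, cx, cy, half_win):
    # Sample a (2*half_win+1)^2 window centered at the float position (cx, cy).
    xs = np.arange(-half_win, half_win + 1) + cx
    ys = np.arange(-half_win, half_win + 1) + cy
    X, Y = np.meshgrid(xs, ys)
    x0 = np.floor(X).astype(int)
    y0 = np.floor(Y).astype(int)
    fx, fy = X - x0, Y - y0
    return ((1 - fx) * (1 - fy) * img[y0, x0]
            + fx * (1 - fy) * img[y0, x0 + 1]
            + (1 - fx) * fy * img[y0 + 1, x0]
            + fx * fy * img[y0 + 1, x0 + 1])

def lucas_kanade_point(I, J, x, y, half_win=7, iters=20, eps=1e-3):
    # Spatial derivatives of the first image (central differences);
    # np.gradient returns the axis-0 (y) derivative first.
    Iy, Ix = np.gradient(I.astype(np.float64))
    ys = slice(y - half_win, y + half_win + 1)
    xs = slice(x - half_win, x + half_win + 1)
    gx, gy = Ix[ys, xs], Iy[ys, xs]
    # G of formula (10): the 2x2 gradient matrix, computed once per point.
    G = np.array([[np.sum(gx * gx), np.sum(gx * gy)],
                  [np.sum(gx * gy), np.sum(gy * gy)]])
    Ginv = np.linalg.inv(G)
    v = np.zeros(2)  # running flow estimate (d_x, d_y)
    for _ in range(iters):
        # delta I = I - J(x + v): the image time-differential at the shifted window.
        dI = I[ys, xs] - _bilinear_window(J.astype(np.float64),
                                          x + v[0], y + v[1], half_win)
        b = np.array([np.sum(dI * gx), np.sum(dI * gy)])  # b_k of formula (12)
        eta = Ginv @ b          # eta_k = G^-1 b_k        (formula (12))
        v = v + eta             # v_k = v_{k-1} + eta_k   (formula (13))
        if np.hypot(eta[0], eta[1]) < eps:
            break
    return v

# Smooth synthetic frame pair: a Gaussian blob shifted by (1.5, -0.5).
yy, xx = np.mgrid[0:64, 0:64]
def blob(cx, cy):
    return np.exp(-((xx - cx) ** 2 + (yy - cy) ** 2) / 18.0)
I_img = blob(30.0, 30.0)
J_img = blob(31.5, 29.5)
flow = lucas_kanade_point(I_img, J_img, 30, 30)
print(flow)  # converges to approximately [1.5, -0.5]
```

Running this coarse-to-fine over down-sampled copies of the two frames, and combining the per-level results as in formula (15), gives the pyramidal version.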
After step S203, step S204 is performed: judge whether all feature points in the tracking window have been tracked. If so, perform step S206; otherwise perform step S205 and select the next feature point to track.
In step S205, a feature point that has not yet been tracked is selected from the feature points contained in the tracking window. When a feature point in the tracking window is selected and tracked, it can be marked to indicate that it has been tracked; then, when step S205 is performed, an unmarked feature point can be selected from the feature points contained in the tracking window for tracking. The feature point may be selected by a variety of methods, for example by random selection.
After step S205, the flow returns to step S203 to continue tracking the selected feature point.
When all feature points in the tracking window have been tracked, i.e. when the judgment result of step S204 is yes, step S206 is performed.
Step S206: compute the median feature point.
After the feature points contained in the tracking window have been tracked, for each tracked feature point in the tracking window, its distances to all the other tracked feature points are computed and summed. Every tracked feature point thus obtains a sum of its distances to the other tracked feature points, and the feature point whose sum of distances to the other tracked feature points is smallest is taken as the median feature point.
Step S207 is performed: determine the preset region based on the median feature point.
In this embodiment, the tracking window is taken to be a square region and the preset region a circular region as the example: the preset region may be the circular region centered on the median feature point whose radius is half the side length of the tracking window.
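Steps S206-S207 can be sketched as follows (the function names and the 20×20 tracking window implied by the radius are illustrative assumptions):

```python
import numpy as np

def median_feature_point(points):
    """Return the tracked point whose summed distance to all other tracked
    points is smallest (step S206's 'median feature point')."""
    d = np.linalg.norm(points[:, None, :] - points[None, :, :], axis=2)
    return points[np.argmin(d.sum(axis=1))]

def outliers(points, center, radius):
    """Indices of tracked points outside the circular preset region."""
    dist = np.linalg.norm(points - center, axis=1)
    return np.where(dist > radius)[0]

# Three clustered tracked points and one stray point.
pts = np.array([[10.0, 10.0], [11.0, 9.0], [9.0, 11.0], [40.0, 40.0]])
m = median_feature_point(pts)
# Radius = half the side length of a hypothetical 20x20 tracking window.
print(m, outliers(pts, m, radius=10.0))  # median (10, 10); index 3 is outside
```

Only the indices returned by `outliers` would then go through the repositioning of step S209; points inside the preset region are left untouched.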
Based on obtained in step S203 track after each characteristic point tracking result and in step S207 really The range of fixed predeterminable area, the spy after being tracked to all characteristic points in tracking window, after choosing a tracking Point is levied, step S208 is executed, whether the characteristic point after judging tracking is in except the predeterminable area range.Sentenced by this step Whether the characteristic point after disconnected current tracking is except the range of the predeterminable area.
If so, step S209 is executed; otherwise, step S211 is executed.
Step S209: determine that the current feature point is a designated feature point, and relocate the designated feature point.
If a tracked feature point lies outside the preset region, it can be determined that this feature point is a designated feature point and needs to be relocated.
In this embodiment, the designated feature point can be relocated by formula (16).
N = R × M + (1 − R) × Nb (16)
In formula (16), Nb on the right-hand side denotes the coordinate value of the designated feature point before the update (relocation), N on the left-hand side denotes the coordinate value obtained after the designated feature point is updated by formula (16), R is an update coefficient whose value ranges between 0 and 1, and M is the coordinate value of the median feature point.
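Formula (16) is a per-coordinate convex combination that pulls an outlying point toward the median feature point. A small sketch (the default R = 0.5 is an assumption of this illustration; the patent only requires R to lie between 0 and 1):

```python
def relocate(nb, m, r=0.5):
    # Formula (16): N = R*M + (1 - R)*Nb, applied to each coordinate.
    # nb: coordinates before relocation, m: median feature point,
    # r: update coefficient in (0, 1); larger r pulls harder toward m.
    return tuple(r * mc + (1.0 - r) * nc for mc, nc in zip(m, nb))
```

With r = 1 the designated point snaps onto the median feature point; with r close to 0 it barely moves.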
The position of a designated feature point, that is, its coordinate value, can be determined in step S203; in the course of calculating the median feature point in step S206, the position of the median feature point, that is, its coordinate value, can be determined.
After the currently tracked feature point has been relocated in step S209, step S210 is executed to judge whether all tracked feature points in the tracking window have been processed, that is, whether every tracked feature point in the tracking window has been judged as to whether it lies outside the preset region. If so, step S212 is executed; otherwise, step S211 is executed to select the next tracked feature point.
Step S211: select the next tracked feature point.
In step S211, from the tracked feature points contained in the tracking window, one feature point is selected that has not yet been judged as to whether it lies outside the preset region. When a tracked feature point in the tracking window has been judged, it can be marked to indicate that it has already been judged; then, when step S211 is executed, an unmarked feature point can be selected from the tracked feature points contained in the tracking window for judging whether it lies outside the preset region. The tracked feature point may be selected from the tracking window by any of a variety of methods, for example, by random selection.
After step S211, the process returns to step S208 to judge whether the selected feature point lies outside the extent of the preset region.
When all tracked feature points in the tracking window have been judged as to whether they lie outside the preset region, that is, when the judging result of step S210 is yes, step S212 is executed: the gesture of the user is recognized based on the positions of the feature points in the tracking window.
After all feature points in the tracking window have been tracked, the positions of the feature points in the tracking window can be determined from the tracking result. Based on the position information of these feature points before and after tracking, information such as the position change and the change in motion direction across the tracking can be obtained; from this information the gesture change of the user can be derived, and the gesture of the user can then be recognized using gesture recognition techniques of the prior art.
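The patent leaves the mapping from point displacements to gestures to prior-art techniques. As one illustrative (assumed, not from the patent) reduction, the mean displacement vector of the tracked points and its direction summarize the motion between frames:

```python
import math

def motion_summary(before, after):
    # Mean displacement vector and motion direction (radians) over
    # corresponding feature points before/after tracking; a gesture
    # recognizer could threshold these to detect e.g. a left/right swipe.
    n = len(before)
    dx = sum(a[0] - b[0] for a, b in zip(after, before)) / n
    dy = sum(a[1] - b[1] for a, b in zip(after, before)) / n
    return (dx, dy), math.atan2(dy, dx)
```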
It should be noted that in this embodiment, after a tracked feature point is determined to be a designated feature point and is relocated, the next tracked feature point is selected and judged as to whether it lies outside the preset region, that is, whether it is a designated feature point; if it is, it is likewise relocated, and so on, until the corresponding operation has been performed on all tracked feature points. In other embodiments, all tracked feature points may first be judged in turn as to whether they are designated feature points, and after all designated feature points have been determined, all designated feature points may then be relocated in turn; the specific manner is not limited thereto.
It should also be noted that in this embodiment, when the feature points of the tracking window of the current frame image are processed, the method used is to select feature points in turn in the tracking window of the current frame and track them; after all feature points have been tracked, the median feature point and the preset region are determined based on the tracking results of all the feature points, it is then judged whether any tracked feature point lies outside the preset region, the tracked feature points lying outside the preset region are relocated, and the gesture of the user is afterwards recognized based on the relocated feature points. In other embodiments, the median feature point and the preset region may instead be determined based on the feature point tracking results of the previous frame image: when feature tracking is performed on the current frame, one feature point in the tracking window of the current frame may be selected and tracked, and it may be judged whether this tracked feature point lies outside the preset region determined from the feature point tracking results of the previous frame image; if so, it is relocated; otherwise, another feature point in the tracking window of the current frame is selected and tracked, and based on the tracking result of this feature point and the preset region determined from the feature point tracking results of the previous frame image it is judged whether it needs to be relocated, and so on, until the corresponding operation has been performed on all feature points.
In this embodiment, during feature point tracking, trace points that may be erroneous are relocated, which can improve the accuracy of the feature points and thereby the accuracy of the tracking result.
Embodiment two
In this embodiment, in order to cope with more general illumination variation, after the feature points contained in the tracking window are obtained and before the iterative calculation of the optical flow, illumination compensation may be performed on the pixels where the feature points are located; afterwards, the feature points may be tracked by a sparse optical flow algorithm based on an image pyramid, and during the tracking of the feature points, the designated feature points are relocated using the method described in embodiment one.
Fig. 3 is a flow diagram of the optical-flow-based feature tracking method provided in this embodiment. As shown in Fig. 3, step S301 is executed first: determine the tracking window of the image. Please refer to step S201 of embodiment one.
Step S302 is executed: obtain the feature points in the tracking window.
All feature points in the tracking window can be obtained based on the Shi-Tomasi corner algorithm; please refer to step S202 of embodiment one.
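Claim 2 spells the Shi-Tomasi criterion out: build each pixel's 2 × 2 autocorrelation matrix from weighted products of the X and Y partial derivatives over the window, then accept the pixel when λ(min) > A × λ(max). A self-contained sketch under those definitions (the flattened derivative lists and the value A = 0.005 are assumptions of this illustration; the claims give 0.001 to 0.01 as the range for A):

```python
import math

def shi_tomasi_is_corner(ix, iy, weights, a=0.005):
    # Entries of the 2x2 autocorrelation matrix M = [[sxx, sxy], [sxy, syy]],
    # accumulated from weighted derivative products over the window.
    sxx = sum(w * gx * gx for w, gx in zip(weights, ix))
    syy = sum(w * gy * gy for w, gy in zip(weights, iy))
    sxy = sum(w * gx * gy for w, gx, gy in zip(weights, ix, iy))
    # Eigenvalues of a symmetric 2x2 matrix, in closed form.
    tr, det = sxx + syy, sxx * syy - sxy * sxy
    disc = math.sqrt(max(tr * tr / 4.0 - det, 0.0))
    lam_max, lam_min = tr / 2.0 + disc, tr / 2.0 - disc
    # The patent's acceptance test: the minimum eigenvalue must exceed a
    # fraction A of the maximum eigenvalue.
    return lam_min > a * lam_max
```

A window with strong gradients in both directions (a corner) passes; a window with gradients along one axis only (an edge) has λ(min) = 0 and fails.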
Step S303 is executed: perform illumination compensation on the pixels where all feature points in the tracking window are located.
Before the pixels are tracked by the optical flow iterative algorithm, illumination compensation is first performed on the pixels where the feature points are located.
In this embodiment, illumination compensation can be performed using a linear transformation involving a bias and a gain: once the gain coefficient and the bias coefficient have been determined, illumination compensation can be performed on the pixels where the feature points contained in the tracking window are located by formula (17).
Jn = λ × J + δ (17)
Here, λ is the gain coefficient of the brightness of the feature point, δ is the bias coefficient of the brightness of the feature point, J is the brightness value of the feature point before compensation, and Jn is the brightness value of the feature point after compensation.
In this embodiment, the optical flow is calculated on the pixels where the feature points are located, so the parameters in formula (17) can be understood as follows: λ is the gain coefficient of the brightness of the pixel where the feature point is located, δ is the bias coefficient of the brightness of that pixel, J is the brightness value of the pixel before compensation, and Jn is the brightness value of the pixel after compensation.
The gain is the value by which the brightness is amplified, and the bias is the value by which the brightness is increased or decreased. The gain coefficient and the bias coefficient of the brightness of the pixel where the feature point is located can be obtained, under the condition that J and Jn have the same mean and variance, by any of a variety of methods known to those skilled in the art.
In other embodiments, illumination compensation may also be performed on the pixels where the feature points are located using other illumination compensation methods; the method is not limited thereto.
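The patent does not fix how λ and δ are obtained. One common realization of a mean/variance constraint is moment matching against a reference patch (matching to a reference rather than to the patch itself is an assumption of this sketch, as is every name in it):

```python
import statistics

def gain_bias(patch, ref_patch):
    # Choose gain lam and bias delta so that lam*patch + delta has the
    # same mean and standard deviation as the reference patch.
    lam = statistics.pstdev(ref_patch) / statistics.pstdev(patch)
    delta = statistics.fmean(ref_patch) - lam * statistics.fmean(patch)
    return lam, delta

def compensate(patch, lam, delta):
    # Formula (17), applied to every brightness value in the patch.
    return [lam * j + delta for j in patch]
```

For instance, the patch [0, 2, 4] compensated against the reference [10, 20, 30] yields λ = 5 and δ = 10, after which the compensated brightness values match the reference statistics exactly.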
After illumination compensation has been performed on the pixels where all feature points contained in the tracking window are located, step S304 is executed: track the feature points in the tracking window based on the sparse optical flow algorithm.
Step S305 is executed after step S304 to judge whether all feature points in the tracking window have been tracked. If so, step S307 is executed; otherwise, step S306 is executed to select the next feature point for tracking.
After step S306, the process returns to step S304 to continue tracking the selected feature point.
When all feature points in the tracking window have been tracked, that is, when the judging result of step S305 is yes, step S307 is executed.
Step S307: calculate the median feature point.
After the feature points contained in the tracking window have been tracked, for any tracked feature point in the tracking window, the distances from this feature point to all the other tracked feature points are calculated separately, and the sum of these distances is computed. Each tracked feature point thus yields a sum of its distances to the other tracked feature points, and the feature point whose sum of distances to the other tracked feature points is smallest is taken as the median feature point.
Step S308 is executed: a preset region is determined based on the median feature point.
In this embodiment, the description still takes the preset region as a circular region as an example. The preset region may be a circular region centered on the median feature point, and the radius of the circular region may be half the side length of the tracking window.
Based on the tracking result of each tracked feature point obtained in step S304 and the extent of the preset region determined in step S308, after all feature points in the tracking window have been tracked, one tracked feature point is selected and step S309 is executed to judge whether the tracked feature point lies outside the preset region. Through this step it is judged whether the currently tracked feature point is outside the extent of the preset region.
If the judging result of step S309 is yes, step S310 is executed; otherwise, step S312 is executed.
Step S310: determine that the current feature point is a designated feature point, and relocate the designated feature point.
After the currently tracked feature point has been relocated in step S310, step S311 is executed to judge whether all tracked feature points in the tracking window have been processed, that is, whether every tracked feature point in the tracking window has been judged as to whether it lies outside the preset region. If so, step S313 is executed; otherwise, step S312 is executed to select the next tracked feature point.
Step S312: select the next tracked feature point.
After step S312, the process returns to step S309 to judge whether the selected feature point lies outside the extent of the preset region.
When all tracked feature points in the tracking window have been judged as to whether they lie outside the preset region, that is, when the judging result of step S311 is yes, step S313 is executed: the gesture of the user is recognized based on the positions of the feature points in the tracking window.
For steps S304 to S313, please refer to steps S203 to S212 of embodiment one.
After all feature points in the tracking window have been tracked, information such as the position change and the change in motion direction of the feature points across the tracking can be obtained from the tracking result; from this information the gesture change of the user can be derived, and the gesture of the user can then be recognized using gesture recognition techniques of the prior art.
In this embodiment, before the feature points are tracked based on the optical flow algorithm, illumination compensation is performed on the pixels where the feature points are located by means of the illumination compensation method, which can effectively adapt to images under different illumination conditions and improve the stability and accuracy of feature point tracking under different illumination conditions.
Although the present disclosure is as above, the present invention is not limited thereto. Any person skilled in the art can make various changes and modifications without departing from the spirit and scope of the invention; therefore, the protection scope of the present invention shall be subject to the scope defined by the claims.

Claims (10)

1. An optical-flow-based feature tracking method, characterized by comprising:
obtaining feature points contained in a tracking window;
tracking the feature points based on a sparse optical flow algorithm;
when a tracked feature point is located outside a preset region, relocating the tracked feature point, the preset region being a region centered on a median feature point, the median feature point being the feature point, among the tracked feature points, whose sum of distances to all the other tracked feature points is smallest;
the preset region being a circular region centered on the median feature point whose radius is half the side length of the tracking window;
the process of relocating the tracked feature point comprising:
relocating the tracked feature point by the formula N = R × M + (1 − R) × Nb, wherein N is the coordinate value of the tracked feature point after relocation, R is an update coefficient whose value ranges between 0 and 1, M is the coordinate value of the median feature point, and Nb is the coordinate value of the tracked feature point before relocation.
2. The optical-flow-based feature tracking method according to claim 1, characterized in that the process of obtaining the feature points contained in the tracking window comprises:
obtaining the autocorrelation matrix of every pixel in the tracking window of the image by the following formula:
M(x, y) = Σ_{i = x − K}^{x + K} Σ_{j = y − K}^{y + K} w_{i,j} [Ix·Ix, Ix·Iy; Ix·Iy, Iy·Iy]
wherein M(x, y) denotes the autocorrelation matrix of the pixel at coordinates (x, y); i and j are the index values of a pixel in the tracking window in the X direction and the Y direction respectively; w_{i,j} is the weight of the pixel whose index value in the X direction is i and whose index value in the Y direction is j; K is the half-width of the tracking window; and Ix and Iy are respectively the partial derivative in the X direction and the partial derivative in the Y direction of the pixel whose index value in the X direction is i and whose index value in the Y direction is j;
obtaining, based on the autocorrelation matrix of the pixel, the maximum eigenvalue and the minimum eigenvalue of the autocorrelation matrix of the pixel;
when λ(min) > A × λ(max), determining that the pixel is a feature point contained in the tracking window, wherein λ(max) is the maximum eigenvalue of the autocorrelation matrix of the pixel, λ(min) is the minimum eigenvalue of the autocorrelation matrix of the pixel, and A is a feature threshold.
3. The optical-flow-based feature tracking method according to claim 2, characterized in that the value of the feature threshold is 0.001 to 0.01.
4. The optical-flow-based feature tracking method according to claim 1, characterized by further comprising: after obtaining the feature points contained in the tracking window and before tracking the feature points based on the sparse optical flow algorithm, performing illumination compensation on the feature points.
5. The optical-flow-based feature tracking method according to claim 4, characterized in that performing illumination compensation on the feature points comprises:
performing illumination compensation on the feature points contained in the tracking window based on the formula Jn = λ × J + δ, wherein λ is the gain coefficient of the brightness of the feature point, δ is the bias coefficient of the brightness of the feature point, J is the brightness value of the feature point before compensation, and Jn is the brightness value of the feature point after compensation.
6. The optical-flow-based feature tracking method according to claim 1, characterized in that the sparse optical flow algorithm is an image pyramid optical flow algorithm.
7. The optical-flow-based feature tracking method according to claim 1, characterized by further comprising: after relocating the tracked feature points, recognizing the gesture of a user based on the tracking result of the feature points in the tracking window.
8. An optical-flow-based feature tracking device, characterized by comprising:
an acquiring unit, adapted to obtain feature points contained in a tracking window;
a tracking unit, adapted to track the feature points based on a sparse optical flow algorithm;
a relocating unit, adapted to relocate a tracked feature point when the tracked feature point is located outside a preset region, the preset region being a region centered on a median feature point, the median feature point being the feature point, among the tracked feature points, whose sum of distances to all the other tracked feature points is smallest, and the preset region being a circular region centered on the median feature point whose radius is half the side length of the tracking window;
wherein the relocating unit relocates the tracked feature point by the formula N = R × M + (1 − R) × Nb, in which N is the coordinate value of the tracked feature point after relocation, R is an update coefficient whose value ranges between 0 and 1, M is the coordinate value of the median feature point, and Nb is the coordinate value of the tracked feature point before relocation.
9. The optical-flow-based feature tracking device according to claim 8, characterized by further comprising: a compensating unit, adapted to perform illumination compensation on the feature points after the feature points of the image are obtained and before the feature points are tracked based on the sparse optical flow algorithm.
10. The optical-flow-based feature tracking device according to claim 8, characterized by further comprising: a recognizing unit, adapted to recognize the gesture of a user based on the tracking result of the feature points in the tracking window after the tracked feature points are relocated.
CN201310529938.8A 2013-10-31 2013-10-31 A kind of characteristic tracking method and device based on light stream Active CN104599286B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201310529938.8A CN104599286B (en) 2013-10-31 2013-10-31 A kind of characteristic tracking method and device based on light stream


Publications (2)

Publication Number Publication Date
CN104599286A CN104599286A (en) 2015-05-06
CN104599286B true CN104599286B (en) 2018-11-16

Family

ID=53125036

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201310529938.8A Active CN104599286B (en) 2013-10-31 2013-10-31 A kind of characteristic tracking method and device based on light stream

Country Status (1)

Country Link
CN (1) CN104599286B (en)

Families Citing this family (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106204645A (en) * 2016-06-30 2016-12-07 南京航空航天大学 Multi-object tracking method
CN106204484B (en) * 2016-07-11 2020-07-24 徐州工程学院 Traffic target tracking method based on optical flow and local invariant features
CN107169458B (en) * 2017-05-18 2018-04-06 深圳云天励飞技术有限公司 Data processing method, device and storage medium
CN108428249A (en) * 2018-01-30 2018-08-21 哈尔滨工业大学深圳研究生院 A kind of initial position and orientation estimation method based on optical flow tracking and double geometrical models
CN108961308B (en) * 2018-06-01 2021-07-02 南京信息工程大学 Residual error depth characteristic target tracking method for drift detection
CN109598744B (en) * 2018-11-29 2020-12-08 广州市百果园信息技术有限公司 Video tracking method, device, equipment and storage medium
CN110322477B (en) * 2019-06-10 2022-01-04 广州视源电子科技股份有限公司 Feature point observation window setting method, tracking method, device, equipment and medium
CN111047626B (en) * 2019-12-26 2024-03-22 深圳云天励飞技术有限公司 Target tracking method, device, electronic equipment and storage medium
CN111402294B (en) * 2020-03-10 2022-10-18 腾讯科技(深圳)有限公司 Target tracking method, target tracking device, computer-readable storage medium and computer equipment
CN113496505B (en) * 2020-04-03 2022-11-08 广州极飞科技股份有限公司 Image registration method and device, multispectral camera, unmanned equipment and storage medium
CN111609868A (en) * 2020-05-29 2020-09-01 电子科技大学 Visual inertial odometer method based on improved optical flow method

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20130015982A (en) * 2011-08-05 2013-02-14 엘지전자 주식회사 Apparatus and method for tracking car
CN103136526A (en) * 2013-03-01 2013-06-05 西北工业大学 Online target tracking method based on multi-source image feature fusion
CN103325108A (en) * 2013-05-27 2013-09-25 浙江大学 Method for designing monocular vision odometer with light stream method and feature point matching method integrated

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9141196B2 (en) * 2012-04-16 2015-09-22 Qualcomm Incorporated Robust and efficient learning object tracker


Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
视频序列中人脸检测光流跟踪技术研究;杨渊波;《中国优秀硕士学位论文全文数据库 信息科技辑》;20120615;第32-33页第4.2-4.3节,第38页第4.4.1节,第35-36页第4.3.2节,第43-44页第4.4.2节 *



Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant