CN104680551B - Tracking method and device based on skin color detection - Google Patents

Tracking method and device based on skin color detection

Info

Publication number
CN104680551B
CN104680551B
Authority
CN
China
Prior art keywords
area
parameter
fitted ellipse
pixel
fitted
Prior art date
Legal status
Active
Application number
CN201310633324.4A
Other languages
Chinese (zh)
Other versions
CN104680551A (en)
Inventor
穆星
张乐
陈敏杰
林福辉
Current Assignee
Spreadtrum Communications Tianjin Co Ltd
Original Assignee
Spreadtrum Communications Tianjin Co Ltd
Priority date
Filing date
Publication date
Application filed by Spreadtrum Communications Tianjin Co Ltd
Priority to CN201310633324.4A
Publication of CN104680551A
Application granted
Publication of CN104680551B
Status: Active
Anticipated expiration

Abstract

A tracking method and device based on skin color detection. The method includes: performing ellipse fitting on at least one first area to obtain the fitted-ellipse parameters of each first area, the first area being a skin color area in a first input image; obtaining a first parameter and a second parameter of a second area based on the relation between a distance threshold μ and the distances between the pixels of the second area and each first area, the second area being a skin color area in a second input image, the first parameter being the fitted-ellipse parameters of the corresponding first area in an ellipse parameter set, and the second parameter being the fitted-ellipse parameters obtained by performing ellipse fitting on the second area; and tracking the second area based on its first parameter and second parameter. The method can track the tracked skin color area accurately, is simple to process, requires little computation, and is easy to realize on mobile terminals.

Description

Tracking method and device based on skin color detection
Technical field
The present invention relates to the technical field of image processing, and in particular to a tracking method and device based on skin color detection.
Background art
In color images, skin color information is relatively stable because it is not affected by human posture, facial expression, and the like, and because skin color clearly differs from the color of most background objects. Skin color detection is therefore widely used in face detection, gesture analysis, target tracking, and image retrieval. The purpose of human skin color detection is to automatically locate the exposed skin areas of the human body in an image, for example to detect regions such as a person's face or hands.
Meanwhile, with the rapid development of moving-target tracking technology, a variety of tracking methods have arisen that rely on the color features, motion information, or other image information of the moving target. Tracking methods based on color features, such as mean shift and continuously adaptive mean shift, can track gestures and similar targets well in simple scenes. Tracking methods based on motion information include optical flow, Kalman filtering (Kalman Filter), and particle filtering (Particle Filter).
Based on the above moving-target detection and tracking methods, image sequences captured of a moving person's hand or face can be tracked; for example, regions such as a person's face or hands detected from the image by a human skin color detection method can be tracked. In moving-target detection and tracking, feature detection and tracking of the moving target is an important foundation and key technology.
In the prior art, however, the above methods may all run into problems when detecting and tracking a moving target. For example, methods based on color features have low robustness to complex scenes and illumination changes; methods based on motion information may have difficulty adapting to arbitrary gesture changes, and their tracking process involves a large amount of computation. Moreover, when large-area occlusion occurs between multiple moving targets, it is difficult for these methods to track the moving targets accurately.
For related art, refer to U.S. patent application publication No. US2013259317A1.
Summary of the invention
The technical solution of the present invention solves the problems that a tracked object is difficult to track accurately and that the tracking process involves a large amount of computation.
To solve the above problems, the technical solution of the present invention provides a tracking method based on skin color detection, the method including:
performing ellipse fitting on at least one first area to obtain the fitted-ellipse parameters of each first area, the first area being a skin color area in a first input image;
obtaining a first parameter and a second parameter of a second area based on the relation between a distance threshold μ and the distances between the pixels of the second area and each first area, the second area being a skin color area in a second input image, the first parameter being the fitted-ellipse parameters of the corresponding first area in an ellipse parameter set, and the second parameter being the fitted-ellipse parameters obtained by performing ellipse fitting on the second area;
tracking the second area based on the first parameter and the second parameter of the second area;
wherein:
the fitted-ellipse parameters include the coordinate value of the center point of the fitted ellipse;
the ellipse parameter set includes the fitted-ellipse parameters of each first area;
the distance between a pixel and a first area is the distance between the pixel and the center point of the fitted ellipse of that first area in the ellipse parameter set.
Optionally, the skin color areas are obtained by a skin color detection method based on a skin color ellipse model.
Optionally, the method further includes:
updating the skin color ellipse model by the formula P(s/c) = γ × P(s/c) + (1 − γ) × Pw(s/c), where s is the pixel value of a pixel of the input image, c is the pixel value of a skin pixel, P(s/c) is the probability that the pixel is a skin point, Pw(s/c) is the probability that the pixel is a skin point as obtained through the skin color ellipse model over w consecutive frames, and γ is a sensitivity parameter.
Optionally, the ellipse fitting of an area is determined based on a covariance matrix computed from the pixels of the area.
Optionally, obtaining the first parameter and the second parameter of the second area based on the relation between the distance threshold μ and the distances between the pixels of the second area and each first area includes:
if the distances between all pixels of the second area and at least one first area are all less than μ, the first parameter of the second area is the fitted-ellipse parameters of the first area, among the at least one first area, that is closest to the second area, and the second parameter of the second area is the fitted-ellipse parameters obtained by performing ellipse fitting on all pixels of the second area;
the first area closest to the second area is the one with the largest number of corresponding pixels, where a pixel of the second area corresponds to a first area if its distance to that first area is less than its distance to the other first areas.
Optionally, obtaining the first parameter and the second parameter of the second area based on the relation between the distance threshold μ and the distances between the pixels of the second area and each first area includes:
if the distances between part of the pixels of the second area and N first areas h1, h2, ..., hN are all less than μ, determining that the second area has N first parameters A1, A2, ..., AN and N second parameters B1, B2, ..., BN, where Aj is the fitted-ellipse parameters of the first area hj, Bj is the fitted-ellipse parameters obtained by performing ellipse fitting on the pixels of a first set and a second set, the first set is the set of said part of the pixels, and the second set is the set of pixels of the second area, other than said part of the pixels, that correspond to the first area hj; the distance between a pixel corresponding to the first area hj and the first area hj is less than the distance between that pixel and the other first areas, with 1 ≤ j ≤ N and N ≥ 2.
Optionally, the value of the distance threshold μ ranges between 1 and 2.
Optionally, the fitted-ellipse parameters further include the major-axis length, the minor-axis length, and the rotation angle of the fitted ellipse;
the distance between a pixel of the second area and a first area is calculated based on the formula D(p, h), where p is a pixel of the second area, (x, y) is the coordinate value of p, h is the fitted ellipse corresponding to the first area, (xc, yc) is the coordinate value of the center point of the fitted ellipse, α is the major-axis length of the fitted ellipse, β is the minor-axis length of the fitted ellipse, and θ is the rotation angle of the fitted ellipse.
Optionally, obtaining the first parameter and the second parameter of the second area based on the relation between the distance threshold μ and the distances between the pixels of the second area and each first area includes:
if the distances between all pixels of the second area and any first area are all greater than μ, the first parameter of the second area is empty, and the second parameter of the second area is the fitted-ellipse parameters obtained by performing ellipse fitting on all pixels of the second area.
Optionally, the method further includes: after tracking all second areas of K consecutive frames, if the distances between all pixels of all second areas of the K consecutive frames and a same first area are all greater than μ, deleting the fitted-ellipse parameters of that first area from the ellipse parameter set, where K ranges from 5 to 20.
Optionally, the method further includes: updating the fitted-ellipse parameters of the first area corresponding to the second area in the ellipse parameter set to the second parameter of the second area.
Optionally, the method further includes:
determining the coordinate value (xc+1, yc+1) of the center point of the fitted ellipse corresponding to a third area based on the formula (xc+1, yc+1) = (xc, yc) + Δc, the third area being the skin color area corresponding to the second area in the next input frame;
where Δc = (xc, yc) − (xc-1, yc-1), (xc, yc) is the coordinate value of the center point of the fitted ellipse in the second parameter of the second area, and (xc-1, yc-1) is the coordinate value of the center point of the fitted ellipse in the first parameter of the second area.
The technical solution of the present invention also provides a tracking device based on skin color detection, the device including:
a first acquisition unit, adapted to perform ellipse fitting on at least one first area to obtain the fitted-ellipse parameters of each first area, the first area being a skin color area in a first input image;
a second acquisition unit, adapted to obtain a first parameter and a second parameter of a second area based on the relation between a distance threshold μ and the distances between the pixels of the second area and each first area, the second area being a skin color area in a second input image, the first parameter being the fitted-ellipse parameters of the corresponding first area in an ellipse parameter set, and the second parameter being the fitted-ellipse parameters obtained by performing ellipse fitting on the second area;
a tracking unit, adapted to track the second area based on the first parameter and the second parameter of the second area;
wherein:
the fitted-ellipse parameters include the coordinate value of the center point of the fitted ellipse;
the ellipse parameter set includes the fitted-ellipse parameters of each first area;
the distance between a pixel and a first area is the distance between the pixel and the center point of the fitted ellipse of that first area in the ellipse parameter set.
Optionally, the device further includes: an updating unit, adapted to update the fitted-ellipse parameters of the first area corresponding to the second area in the ellipse parameter set to the second parameter of the second area.
Optionally, the device further includes: a prediction unit, adapted to determine the coordinate value (xc+1, yc+1) of the center point of the fitted ellipse corresponding to a third area based on the formula (xc+1, yc+1) = (xc, yc) + Δc, the third area being the skin color area corresponding to the second area in the next input frame;
where Δc = (xc, yc) − (xc-1, yc-1), (xc, yc) is the coordinate value of the center point of the fitted ellipse in the second parameter of the second area, and (xc-1, yc-1) is the coordinate value of the center point of the fitted ellipse in the first parameter of the second area.
Compared with the prior art, the technical solution of the present invention has the following advantages:
Ellipse fitting is performed on the skin color areas obtained by a skin color detection method (the first areas and the second area) to obtain the fitted-ellipse parameters of each area. During tracking, based on the relation between the distance threshold μ and the distances between the pixels of the skin color area of the current input image (the second area) and each skin color area of the previous input image (each first area), the fitted-ellipse parameters of the tracked skin color area (its first parameter and second parameter) can be determined accurately. The tracked skin color area can then be tracked accurately from the change of its fitted-ellipse parameters during tracking. The method is simple to process, requires little computation, and is easy to realize on mobile terminals.
While the skin color areas are being detected by the skin color detection method, the skin color ellipse model used for skin color detection is optimized. The optimized skin color ellipse model adapts its detection to the current input image information, is more robust to illumination, and effectively improves the accuracy of skin color area detection.
During tracking, different methods of determining the first parameter and the second parameter are chosen according to the different relations between the distance threshold μ and the distances between the pixels of the skin color area of the current input image (the second area) and each skin color area of the previous input images (the first areas), so that the fitted-ellipse parameters of the tracked skin color area can be determined accurately during tracking. In particular, when multiple skin color areas occlude each other in the current input image, the method can still track each skin color area separately.
After tracking, based on the fitted-ellipse parameters of the tracked skin color area in the current input image and its fitted-ellipse parameters in the previous input images, the fitted-ellipse parameters of the tracked skin color area in the next input frame can be predicted.
Brief description of the drawings
Fig. 1 is a schematic flowchart of the tracking method based on skin color detection provided by the technical solution of the present invention;
Fig. 2 is a schematic flowchart of the tracking method based on skin color detection provided by an embodiment of the present invention;
Fig. 3 is a schematic flowchart of optimizing the skin color ellipse model provided by an embodiment of the present invention;
Fig. 4 is a schematic flowchart of the tracking method based on skin color detection provided by another embodiment of the present invention.
Detailed description
In the prior art, when skin color areas are detected and tracked, robustness to complex scenes and illumination changes is low, and when multiple skin color areas are present and tracked, they cannot be tracked effectively.
To solve the above problems, the technical solution of the present invention provides a tracking method based on skin color detection. In this method, during tracking, the fitted-ellipse parameters of the skin color area of the current input image are determined from the relation between the distance threshold μ and the distances between the pixels of that skin color area and each skin color area of the previous input images, thereby realizing the tracking of the skin color area of the current input image.
Fig. 1 is a schematic flowchart of the tracking method based on skin color detection provided by the technical solution of the present invention. As shown in Fig. 1, step S101 is performed first: ellipse fitting is performed on at least one first area to obtain the fitted-ellipse parameters of each first area.
The first area is a skin color area in a first input image, a skin color area here being a skin color area used for tracking. The first input image may be the initial input image before the current skin color area is tracked, and the skin color areas contained in the first input image can be obtained by any of a variety of prior-art skin color detection methods. Because one frame of input image may contain one or several skin color areas, the skin color areas used for tracking in the first input image may be one or more; that is, in this step, ellipse fitting must be performed on at least one skin color area (first area). The skin color detection may be realized based on a single Gaussian model, a Gaussian mixture model, an elliptical skin color model, or the like.
Ellipse fitting is performed on a skin color area (first area) to obtain the fitted-ellipse parameters corresponding to the area; the fitted-ellipse parameters include the coordinate value of the center point, the major-axis length, the minor-axis length, and the rotation angle of the fitted ellipse corresponding to the skin color area.
Based on the ellipse fitting, the fitted-ellipse parameters of each first area in the first input image can be obtained.
Step S102 is performed: a first parameter and a second parameter of a second area are obtained based on the relation between the distance threshold μ and the distances between the pixels of the second area and each first area.
The second area is a skin color area in a second input image; the second input image may be the current input image containing the tracked skin color area (the second area).
The distance between a pixel of the second area and a first area is the distance between the pixel and the center point of the fitted ellipse of that first area in the ellipse parameter set; the ellipse parameter set is the set of the fitted-ellipse parameters of each first area in the first input image obtained in step S101.
The first parameter of the second area is the fitted-ellipse parameters, in the ellipse parameter set, of the first area corresponding to the second area; the second parameter is the fitted-ellipse parameters obtained by performing ellipse fitting on the second area.
Step S103 is performed: the second area is tracked based on its first parameter and second parameter.
After the first parameter and the second parameter of the second area are obtained in step S102: since the first parameter of the second area is the fitted-ellipse parameters of the first area corresponding to the second area, the previous fitted-ellipse information of the second area can be obtained from it; and since the second parameter is the fitted-ellipse parameters obtained by performing ellipse fitting on the second area, the current fitted-ellipse information of the second area can be obtained from it. Based on this fitted-ellipse information at different moments, the second area can be tracked accurately.
To make the above objects, features, and advantages of the present invention clearer and easier to understand, specific embodiments of the present invention are described in detail below with reference to the accompanying drawings.
In this embodiment, the fitted-ellipse parameters of the skin color areas in the input image are obtained by ellipse fitting, and the tracking process is illustrated for the case where a tracked skin color area corresponds to multiple fitted-ellipse parameters in the ellipse parameter set.
Fig. 2 is a schematic flowchart of the tracking method based on skin color detection provided by this embodiment. As shown in Fig. 2, step S201 is performed first: the first areas in the first input image are detected based on the skin color ellipse model.
The first input image is read first. If the first input image is in an RGB-space picture format, a color-space conversion may first be performed to convert it from RGB space to YCbCr space.
In YCbCr space, Y represents luminance while Cb and Cr are color-difference signals representing chrominance. Under different illumination conditions the luminance of an object's color can vary greatly, but its chrominance is stable over a very large range and remains essentially unchanged. Moreover, prior studies have shown that human skin colors are relatively concentrated in YCbCr space, i.e., skin color exhibits clustering: the color differences between different races are mainly caused by luminance and are unrelated to the color attributes. Using this property, image pixels can be divided into skin and non-skin pixels. Therefore, in this embodiment, to improve the accuracy of skin color area detection, the image is converted from the commonly used RGB space into YCbCr space.
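As one possible realization of this conversion step, the sketch below uses OpenCV; the library choice and the channel reordering are assumptions, since the description only requires converting the image from RGB space to YCbCr space.

```python
# A minimal sketch of the RGB-to-YCbCr conversion, assuming OpenCV is available.
import cv2

def to_ycbcr(rgb_image):
    # OpenCV returns channels in Y, Cr, Cb order; reorder to Y, Cb, Cr
    # to match the notation used in the description.
    ycrcb = cv2.cvtColor(rgb_image, cv2.COLOR_RGB2YCrCb)
    y, cr, cb = cv2.split(ycrcb)
    return cv2.merge([y, cb, cr])
```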
Afterwards, initial detection can be carried out using a skin color ellipse model trained in the prior art, to obtain the one or more skin color areas contained in the initial input image.
Because the detection result of a skin color ellipse model trained in the prior art may contain some erroneous detection regions, for example holes inside skin color areas, in this embodiment the skin color area information in the skin color detection result may first be pre-processed. Considering the connectivity and size of skin color objects, holes in the skin color areas of the image can be eliminated by four-connected or eight-connected region filling.
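The sketch below shows one way to fill such holes in a detected skin mask; the use of SciPy's binary_fill_holes is an implementation assumption standing in for the four-/eight-connected region filling mentioned above.

```python
# A sketch of the pre-processing step that fills holes inside detected skin areas.
from scipy.ndimage import binary_fill_holes

def fill_skin_holes(skin_mask):
    # skin_mask: 2-D boolean array, True where a pixel was classified as skin.
    return binary_fill_holes(skin_mask)
```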
In this embodiment, based on the pre-processed skin color area information, the skin color ellipse model can be optimized through the following formula (1).
P(s/c) = γ × P(s/c) + (1 − γ) × Pw(s/c) (1)
where s is the pixel value of a pixel of the input image, c is the pixel value of a skin pixel, P(s/c) on the left of the equation is the probability, after optimization, that the pixel is a skin point, P(s/c) on the right of the equation is the probability, before optimization, that the pixel is a skin point as obtained through the skin color ellipse model, Pw(s/c) is the probability that the pixel is a skin point as obtained through the skin color ellipse model over w consecutive frames, and γ is a sensitivity parameter.
After the skin color ellipse model is optimized, the first input image can be re-read, the color-space conversion performed, and skin color detection carried out again based on the updated skin color ellipse model. After detection, the skin color area information in the detection result can again be pre-processed. If the optimized skin color area information is satisfactory, one or more specific skin color areas can be extracted based on the current optimized skin color area information; if not, model optimization of the skin color ellipse model can be repeated through formula (1) based on the pre-processed skin color area information, until the pre-processed and model-optimized skin color area information meets the user's requirements.
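A sketch of the model update of formula (1) follows. It assumes the skin color ellipse model is kept as a probability table indexed by chrominance; only the blending rule itself comes from the description, while the table layout and the default value of γ are assumptions.

```python
# Formula (1): blend the current model with the statistics of the last w frames.
def update_skin_model(p_skin, p_skin_recent, gamma=0.9):
    """p_skin:        current P(s/c) values, e.g. a (256, 256) table over (Cb, Cr)
    p_skin_recent: Pw(s/c) estimated from the last w frames, same shape
    gamma:         sensitivity parameter (default value assumed)"""
    return gamma * p_skin + (1.0 - gamma) * p_skin_recent
```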
For the above process, refer also to Fig. 3, which is a schematic flowchart of optimizing the skin color ellipse model.
It should be noted that, during initial detection based on the skin color ellipse model, at least one of the pre-processing and model-optimization methods described above may be used.
After step S201, step S202 is performed: ellipse fitting is performed on each first area in the first input image.
Based on step S201, the one or more skin color areas in the first input image can be obtained, and from them each first area in the first input image can be determined.
Considering that skin color objects may overlap each other during actual motion, the number of detected skin color areas may not equal the number of skin color areas used for tracking. In this specification, a first area refers to a skin color area used for tracking.
In this embodiment, the case where the first input image contains multiple first areas is used as an example.
Since the shapes of skin color objects such as faces and human hands are approximately elliptical, the multiple first areas contained in the first input image can each be fitted to an elliptical shape by ellipse fitting; the elliptical shape can usually be represented by the ellipse model shown in formula (2).
H=h (xc, yc, α, β, θ)(2)
Wherein, h represents the fitted ellipse corresponding to the first area, xc,ycFor the fitting corresponding to the first area The coordinate value of oval central point, α are the long axis length of the fitted ellipse corresponding to the first area, and β is firstth area The minor axis length of fitted ellipse corresponding to domain, θ are the anglec of rotation of the fitted ellipse corresponding to the first area.
In this embodiment, ellipse fitting of the first area can be performed by computing a covariance matrix from the pixels of the first area.
Taking one first area in the first input image as an example: since the first area corresponds to a cluster of contiguous skin pixels, a covariance matrix Σ can be computed from the skin pixels of the first area.
Specifically, let X = [x1 ... xn] denote the X-direction vector of the pixel set and Y = [y1 ... yn] denote its Y-direction vector, where x1 ... xn are the X-direction coordinates of the skin pixels of the first area, y1 ... yn are their Y-direction coordinates, and n is the number of skin pixels in the first area.
Let Z = (X, Y)^T; the covariance matrix Σ can then be obtained through formula (3).
Σ = E((Z − E(Z))(Z − E(Z))^T) (3)
where E denotes the mathematical expectation.
The covariance matrix Σ computed by the vector calculation shown in formula (3) is essentially a 2 × 2 matrix, which can be expressed in the form of formula (4), where each element of Σ is the covariance between the X-direction vector and the Y-direction vector of the pixel set.
Based on formula (5), the major-axis length α of the fitted ellipse corresponding to the first area can be obtained from the covariance matrix Σ.
Based on formula (6), the minor-axis length β of the fitted ellipse corresponding to the first area can be obtained from the covariance matrix Σ.
Based on formula (7), the rotation angle θ of the fitted ellipse corresponding to the first area can be obtained from the covariance matrix Σ.
The coordinate value (xc, yc) of the center point of the fitted ellipse corresponding to the first area can be obtained from the coordinate values of the pixels on the boundary of the first area during the fitting of the first area.
At this point, the initial fitted-ellipse parameters of the first area are obtained; they include the coordinate value of the center point, the major-axis length, the minor-axis length, and the rotation angle of the fitted ellipse corresponding to the first area.
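The following sketch illustrates the covariance-based fit of formulas (3) to (7). Since the formula images are not reproduced above, the eigen-decomposition, the scale factor applied to the square-rooted eigenvalues, the angle convention, and the use of the pixel centroid as the ellipse center are assumptions of this sketch rather than the patent's exact formulas.

```python
# A sketch of fitting an ellipse to one skin area from the covariance matrix
# of its pixel coordinates; scale factor and angle convention are assumed.
import numpy as np

def fit_ellipse(xs, ys, scale=2.0):
    """xs, ys: 1-D arrays of the x and y coordinates of the area's skin pixels."""
    xc, yc = float(np.mean(xs)), float(np.mean(ys))
    z = np.stack([xs - xc, ys - yc])              # 2 x n matrix, Z - E(Z)
    cov = z @ z.T / z.shape[1]                    # formula (3): 2 x 2 covariance
    eigvals, eigvecs = np.linalg.eigh(cov)        # eigenvalues in ascending order
    beta = scale * np.sqrt(eigvals[0])            # minor-axis length
    alpha = scale * np.sqrt(eigvals[1])           # major-axis length
    major = eigvecs[:, 1]                         # direction of the major axis
    theta = float(np.arctan2(major[1], major[0])) # rotation angle
    return xc, yc, alpha, beta, theta
```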
Step S203 is performed: the ellipse parameter set is established.
Based on step S202, the fitted-ellipse parameters of each first area in the first input image can be obtained.
The fitted-ellipse parameters of each first area in the first input image are placed in the same set, forming the ellipse parameter set.
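The ellipse parameter set can be kept as a plain list of per-area fitted-ellipse parameters; the field and function names below are illustrative, reusing the fit_ellipse sketch above.

```python
# A sketch of the ellipse parameter set built in step S203.
from dataclasses import dataclass

@dataclass
class FittedEllipse:
    xc: float     # center point x
    yc: float     # center point y
    alpha: float  # major-axis length
    beta: float   # minor-axis length
    theta: float  # rotation angle

def build_parameter_set(first_areas):
    """first_areas: iterable of (xs, ys) pixel-coordinate arrays, one per first area."""
    return [FittedEllipse(*fit_ellipse(xs, ys)) for xs, ys in first_areas]
```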
Step S204 is performed: each second area in the second input image is detected based on the skin color ellipse model.
For the current input image, i.e. the second input image, each skin color area contained in the current input image can be obtained based on the skin color ellipse model, and from them each second area in the second input image can be determined; for details refer to step S201.
Each second area can then be tracked.
Step S205 is performed: the distances between the pixels of the second area and each first area are calculated.
Taking the tracking of one second area as an example, in this step the distances between all pixels of the tracked second area and each first area are first calculated.
Based on formula (8), the distance between each pixel of the tracked second area and each first area can be obtained, where p is a pixel of the second area, (x, y) is the coordinate value of p, h is the fitted ellipse corresponding to the first area, (xc, yc) is the coordinate value of the center point of the fitted ellipse corresponding to the first area, α is the major-axis length of the fitted ellipse corresponding to the first area, β is the minor-axis length of the fitted ellipse corresponding to the first area, and θ is the rotation angle of the fitted ellipse corresponding to the first area.
For any first area, the distance D(p, h) from each pixel of the tracked second area to that first area can be obtained based on formula (8).
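Since the image of formula (8) is not reproduced above, the sketch below assumes the usual normalized rotated-ellipse distance, which is consistent with the rule used next that D(p, h) ≤ 1 means the pixel lies inside the fitted ellipse.

```python
# A sketch of the pixel-to-ellipse distance D(p, h); the exact formula is an
# assumption. h is a FittedEllipse from the earlier sketch; x and y may be
# scalars or arrays of pixel coordinates.
import numpy as np

def ellipse_distance(x, y, h):
    dx, dy = x - h.xc, y - h.yc
    # project the offset onto the ellipse's major and minor axes
    u = dx * np.cos(h.theta) + dy * np.sin(h.theta)
    v = -dx * np.sin(h.theta) + dy * np.cos(h.theta)
    return np.sqrt((u / h.alpha) ** 2 + (v / h.beta) ** 2)
```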
Step S206 is performed: the first parameter and the second parameter of the second area are determined.
After the distance from each pixel of the tracked second area to each first area is obtained in step S205, the fitted-ellipse parameters, in the ellipse parameter set, of the first area corresponding to the tracked second area can be determined according to these distances.
In this embodiment, the case where the tracked second area corresponds to the fitted-ellipse parameters of multiple first areas in the ellipse parameter set is used as an example.
Based on the distance from each pixel of the tracked second area to a first area, it can be judged whether the pixel lies within the ellipse determined by the fitted-ellipse parameters of that first area.
It is generally considered that when the distance D(p, h) ≤ 1 from a pixel of the tracked second area to a first area, computed by formula (8), the pixel lies inside the fitted ellipse corresponding to that first area, i.e. within the range of the fitted ellipse determined by the fitted-ellipse parameters of that first area. The fitted-ellipse parameters of that first area may then be called the fitted-ellipse parameters of the first area corresponding to the second area, and the fitted ellipse determined by them may be regarded as the target fitted ellipse of the second area at the previous moment. The fitted-ellipse parameters of the first area corresponding to the second area are referred to as the first parameter of the second area.
However, because skin color objects such as hands are irregular, a fitted ellipse that is too small for the skin color object would affect the accuracy of the tracking result. Therefore, to obtain a better tracking effect, after the distance from a pixel of the tracked second area to a first area is calculated, whether the pixel lies within the fitted ellipse corresponding to that first area can be judged from the relation between the distance and a distance threshold μ. The distance threshold μ can be set according to the actual tracking situation, the size of the tracked object, the complexity of the tracking scene, the method of computing the covariance matrix, and other factors; in this specification, the value of μ can be a number between 1 and 2.
If, as calculated by formula (8), part of the pixels of the second area have distances less than the distance threshold μ to multiple first areas, say to N first areas h1, h2, ..., hN, then these pixels can be considered to lie simultaneously within the fitted ellipses determined by the fitted-ellipse parameters of the N first areas. The N first areas h1, h2, ..., hN may therefore overlap each other, i.e. different first areas may occlude each other in the tracking scene. The fitted ellipses determined by the fitted-ellipse parameters of these N first areas can be regarded as the target fitted ellipses of the second area, and the second area accordingly has N first parameters A1, A2, ..., AN, which are the fitted-ellipse parameters of the N first areas corresponding to the second area in the ellipse parameter set.
Since the second area has N target fitted ellipses, i.e. corresponds to the fitted-ellipse parameters of N first areas, ellipse fitting of the second area must now be performed for each of the N different first areas hj, so that the second area can be tracked based on the fitting results.
For example, for any one first area hj, among the pixels of the second area other than those whose distances to all N first areas are less than the distance threshold μ, some remaining pixels may be close to hj while others are close to hk. If the pixels close to hk were also used in the ellipse fit, the resulting fitted-ellipse parameters of the second area corresponding to the first area hj would clearly be problematic.
Therefore, in this specification, the ellipse fit of the second area corresponding to the first area hj is computed from said part of the pixels (the first set) together with those of the remaining pixels of the second area, excluding said part of the pixels, that are closest to hj (the second set). The pixels closest to hj are called the pixels of the second area corresponding to the first area hj; the distance between a pixel corresponding to hj and hj is less than the distance between that pixel and the other first areas.
Similarly, for the first area hk, the ellipse fit of the second area corresponding to hk uses said part of the pixels together with those of the remaining pixels of the second area, excluding said part of the pixels, that are closest to hk, and so on. For the N different first areas corresponding to the second area, N ellipse fits of the second area are performed accordingly, obtaining N fitted-ellipse parameters B1, B2, ..., BN in one-to-one correspondence with the first areas h1, h2, ..., hN. These fitted-ellipse parameters B1, B2, ..., BN are referred to as the second parameters of the second area, where 1 ≤ j ≤ N and N ≥ 2; for details refer to step S202.
At this point, the multiple different target fitted ellipses of the second area, and the multiple corresponding first parameters and second parameters, are obtained.
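A sketch of this per-target fit follows, reusing the helpers from the earlier sketches; the helper names and the value of μ are illustrative.

```python
# For each overlapping first area hj, fit an ellipse to the shared pixels
# (the first set) plus the remaining pixels of the second area that are
# closest to hj (the second set).
import numpy as np

def second_parameters(xs, ys, overlapping, others, mu=1.5):
    """xs, ys:       coordinate arrays of all pixels of the second area
    overlapping: the N first-area ellipses h1..hN within distance mu of the shared pixels
    others:      the remaining first-area ellipses in the parameter set"""
    all_ellipses = list(overlapping) + list(others)
    d = np.stack([ellipse_distance(xs, ys, h) for h in all_ellipses])  # (M, n)
    shared = np.all(d[:len(overlapping)] < mu, axis=0)   # the first set
    nearest = np.argmin(d, axis=0)                       # index of the closest first area
    params = []
    for j, hj in enumerate(overlapping):
        use = shared | (nearest == j)                    # first set + pixels nearest to hj
        params.append(FittedEllipse(*fit_ellipse(xs[use], ys[use])))
    return params                                        # B1, B2, ..., BN
```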
Step S207 is performed: the second area is tracked based on its first parameters and second parameters.
Since a first parameter of the second area is the fitted-ellipse parameters of a corresponding first area, it can be regarded as the fitted-ellipse parameters of the second area at the previous moment; the corresponding second parameter of the second area is the fitted-ellipse parameters obtained by ellipse fitting at the current moment. From the first parameters and second parameters of the second area, the fitted-ellipse parameters of the second area at different moments can be determined accurately, the motion of the second area can be determined, and the tracking of the second area is realized.
When determining the target fitted ellipse of the tracked second area, the calculated distances between the second area and the first areas in the ellipse parameter set are compared with the distance threshold; a suitable distance threshold can likewise be determined according to the actual tracking situation, so that the tracking result of the second area is more accurate.
When multiple skin color areas (first areas) in the input image occlude each other, this method can still track the skin color areas well.
The above embodiment describes the tracking process for the case where the tracked skin color area corresponds to multiple fitted-ellipse parameters in the ellipse parameter set. In actual tracking, the relation between the tracked skin color area and the fitted-ellipse parameters in the ellipse parameter set may take several other forms.
Fig. 4 is a schematic flowchart of the tracking method based on skin color detection provided by another embodiment of the present invention. In this embodiment, different tracking processing is performed for the different situations of the tracked skin color object.
During tracking, the following situations usually need to be considered: A, a new skin color object appears in the tracking scene; B, a previously tracked skin color object disappears from the tracking scene; C, a tracked skin color object moves continuously in the scene; D, different skin color objects occlude each other in the tracking scene. For the four skin color object situations A, B, C, and D, the ellipse parameter set correspondingly involves the creation of a target fitted ellipse, the release of a target fitted ellipse, the continuous tracking of a target fitted ellipse, the overlapping of target fitted ellipses, and so on, and the tracking processing of the skin color area corresponding to the skin color object differs accordingly; the target fitted ellipse is the fitted ellipse corresponding to a first area as described above.
In this embodiment, the tracking of the skin color object in the four different situations A, B, C, and D is described.
As shown in Fig. 4, step S401 is performed first: the first areas in the first input image are detected based on the skin color ellipse model.
Step S402 is performed: ellipse fitting is performed on each first area in the first input image.
Step S403 is performed: the ellipse parameter set is established.
Step S404 is performed: each second area in the second input image is detected based on the skin color ellipse model.
Step S405 is performed: the distances between the pixels of the second area and each first area are calculated.
For steps S401 to S405, refer to steps S201 to S205.
Based on the distances between the pixels of the second area and each first area determined in step S405, the second area can be classified into one of the following four cases a, b, c, and d; a sketch of this decision is given after the four cases.
As shown in Fig. 4, if the distances between all pixels of the second area and any first area are all greater than the distance threshold μ, case a is determined.
If, after all second areas of K consecutive frames have been tracked, the distances between all pixels of all second areas of the K consecutive frames and a same first area are all greater than the distance threshold μ, case b is determined.
If the distances between all pixels of the second area and at least one first area are all less than the distance threshold μ, case c is determined.
If the distances between part of the pixels of the second area and N first areas h1, h2, ..., hN are all less than the distance threshold μ, case d is determined.
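A sketch of this per-frame decision follows; case b is decided across K consecutive frames and is handled separately (see the bookkeeping sketch after step S407). The return labels and the value of μ are illustrative.

```python
# Classify a second area against the current ellipse parameter set.
import numpy as np

def classify_second_area(xs, ys, parameter_set, mu=1.5):
    """Returns 'a', 'c' or 'd' for the current frame; case b is handled
    across frames by counting consecutive misses of a first area."""
    d = np.stack([ellipse_distance(xs, ys, h) for h in parameter_set])  # (M, n)
    within = d < mu
    if not within.any():
        return "a"                      # no pixel is close to any first area: new object
    if (within.sum(axis=0) >= 2).any():
        return "d"                      # some pixels are close to several first areas
    return "c"                          # ordinary continuous tracking
```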
Tracking processing is performed for each of the four cases a, b, c, and d.
In case a, as shown in Fig. 4, step S406 is performed: the first parameter of the second area is set to empty, and the fitted-ellipse parameters obtained by performing ellipse fitting on all pixels of the second area are used as the second parameter of the second area.
When the distances between all pixels of the second area and any first area are all greater than the distance threshold μ, it can be determined that the second area has no corresponding first area in the ellipse parameter set, i.e. it does not belong to the fitted ellipse corresponding to any first area; the second area is then the skin color area corresponding to a skin color object that has newly appeared in the tracking scene. The first parameter of the second area is therefore set to empty, and its second parameter is the fitted-ellipse parameters obtained by performing ellipse fitting on all pixels of the second area. These fitted-ellipse parameters can be newly added to the ellipse parameter set; when the second area is tracked in subsequent input images, the fitted-ellipse parameters corresponding to the second area in the ellipse parameter set are updated with the fitted-ellipse parameters computed by the subsequent ellipse fits.
In case b, step S407 is performed: the fitted-ellipse parameters of the first area are deleted from the ellipse parameter set.
If, after all second areas of K consecutive frames have been tracked, the distances between all pixels of all second areas of the K consecutive frames and a same first area are all greater than the distance threshold μ, it can be determined that every second area is farther than μ from that first area, which means that the tracked object corresponding to that first area in the previous frames has disappeared; the fitted-ellipse parameters of that first area can then be deleted from the ellipse parameter set.
The image information of K consecutive frames needs to be considered here because, during skin color detection, skin color information may occasionally be lost in a frame. Therefore, when in one frame the distances between all pixels of all second areas and a same first area are all greater than the distance threshold μ, the fitted-ellipse parameters of that first area may simply not be updated; if in a later frame the distances between the pixels of a second area and that first area are again all less than μ, the fitted-ellipse parameters of that first area can continue to be updated using the above method. If this happens for K consecutive frames, it can be determined that the tracked object has disappeared, and the fitted-ellipse parameters of that first area can be deleted from the ellipse parameter set. K can range from 5 to 20.
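One way to implement this K-frame rule is a per-area miss counter, as sketched below; the counter mechanism and the default K are assumptions.

```python
# Case b bookkeeping: drop a first area whose ellipse has not been matched by
# any second area for K consecutive frames.
def prune_parameter_set(parameter_set, miss_counts, matched_indices, k=10):
    """matched_indices: indices of first areas matched by some second area in this frame."""
    for i in range(len(parameter_set)):
        miss_counts[i] = 0 if i in matched_indices else miss_counts[i] + 1
    keep = [i for i in range(len(parameter_set)) if miss_counts[i] < k]
    return [parameter_set[i] for i in keep], [miss_counts[i] for i in keep]
```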
In case c, step S408 is performed: the first parameter of the second area is the fitted-ellipse parameters of the first area, among the at least one first area, that is closest to the second area; the second parameter of the second area is the fitted-ellipse parameters obtained by performing ellipse fitting on all pixels of the second area; and tracking is performed based on the first parameter and the second parameter of the second area.
If the distances between all pixels of the second area and at least one first area are all less than the distance threshold μ, one or more corresponding first areas may exist in the ellipse parameter set.
If the distances between all pixels of the second area and one first area in the ellipse parameter set are less than μ, while their distances to the other first areas in the ellipse parameter set are all greater than μ, that first area is determined to be the first area corresponding to the second area; the first parameter of the second area is then the fitted-ellipse parameters of that first area, and the second parameter of the second area is the fitted-ellipse parameters obtained by performing ellipse fitting on all pixels of the second area.
If the distances between all pixels of the second area and multiple first areas in the ellipse parameter set are all less than μ, then, since two different first areas generally cannot both be the skin color area corresponding to the same tracked skin color area, the first area corresponding to the second area can be determined based on the distances between the second area and the different first areas.
Specifically, the first parameter of the second area is the fitted-ellipse parameters of the first area, among the multiple first areas, that is closest to the second area, and the second parameter of the second area is the fitted-ellipse parameters obtained by performing ellipse fitting on all pixels of the second area. The first area closest to the second area is the one with the largest number of corresponding pixels, where a pixel of the second area corresponds to a first area if its distance to that first area is less than its distance to the other first areas.
Taking two first areas U and V as an example: when the distance between a pixel of the second area and the first area U is less than its distance to the first area V, the pixel is determined to correspond to the first area U; when the number of pixels of the second area corresponding to the first area U is larger, the first area U is determined to be the first area closest to the second area.
In case d, as shown in Fig. 4, step S409 is performed: it is determined that the second area has N first parameters A1, A2, ..., AN and N second parameters B1, B2, ..., BN, and tracking is performed based on the first parameters and second parameters of the second area.
If the distances between part of the pixels of the second area and N first areas h1, h2, ..., hN are all less than the distance threshold μ, part of the pixels of the second area may lie simultaneously within the fitted ellipses of the N first areas, i.e. different first areas may occlude each other in the tracking scene. This tracking situation is the specific case described in the previous embodiment of this specification; for the specific tracking process refer to steps S206 and S207, which will not be repeated here.
In this embodiment, different tracking processing is performed for different situations, so that the tracked skin color area can be tracked accurately and effectively in different situations.
After the second area in the current input image, i.e. the second input image, has been tracked, in order to continue tracking the second area in the next input frame, the tracking method based on skin color detection of the embodiments of the present invention may further include: updating the fitted-ellipse parameters of the first area corresponding to the second area in the ellipse parameter set to the second parameter of the second area in the current frame (the second input image). When the second area in the next input frame is then tracked, the updated fitted-ellipse parameters of the first area corresponding to the second area in the ellipse parameter set are used as the first parameter of that second area, its second parameter is determined by the method provided in the embodiments of the present invention, and the second area in the next input frame is tracked based on its first parameter and second parameter; and so on, so that the ellipse parameter set can be updated synchronously in real time during tracking.
The first area corresponding to the second area can be determined based on the distances between the pixels of the second area and the first areas.
Taking case a of the embodiment shown in Fig. 4 as an example: if the distances between all pixels of the second area and any first area are all greater than the distance threshold μ, the second area has no corresponding first area in the ellipse parameter set; the fitted-ellipse parameters obtained by performing ellipse fitting on all pixels of the second area can then be newly added to the ellipse parameter set, and when the second area is tracked in subsequent input images, the ellipse region determined by the newly added fitted-ellipse parameters in the ellipse parameter set is used as the first area corresponding to the second area.
Taking case c of the embodiment shown in Fig. 4 as another example: if the distances between all pixels of the second area and one first area in the ellipse parameter set are less than μ, while their distances to the other first areas in the ellipse parameter set are all greater than μ, that first area is determined to be the first area corresponding to the second area; if the distances between all pixels of the second area and multiple first areas in the ellipse parameter set are all less than μ, the first area closest to the second area is determined to be the first area corresponding to the second area.
Taking case d of the embodiment shown in Fig. 4 as an example: if the distances between part of the pixels of the second area and multiple first areas are all less than the distance threshold μ, these multiple first areas can be called the first areas corresponding to the second area; they can be determined specifically with reference to the above methods.
In addition, although skin color objects such as human hands and faces may move along irregular trajectories in a moving scene, the motion of a skin color object between consecutive frames can be approximated as linear. Based on the center-point coordinate values of the fitted ellipses of the current frame and the previous input frame, the center-point coordinate value of the fitted ellipse corresponding to the skin color object in the next input frame can therefore be predicted; during prediction, the other parameters of the fitted ellipse can remain unchanged.
Specifically, based on the first parameter and the second parameter of the second area, the coordinate value (xc+1, yc+1) of the center point of the fitted ellipse corresponding to a third area can be predicted in real time through formula (9).
(xc+1, yc+1) = (xc, yc) + Δc (9)
where Δc = (xc, yc) − (xc-1, yc-1), (xc, yc) is the coordinate value of the center point of the fitted ellipse in the second parameter of the second area, and (xc-1, yc-1) is the coordinate value of the center point of the fitted ellipse in the first parameter of the second area. The third area is the skin color area corresponding to the second area in the next input frame.
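A sketch of the prediction of formula (9): the center is extrapolated linearly from the first and second parameters of the second area, while the other ellipse parameters are kept unchanged.

```python
# Formula (9): predict the ellipse center of the third area in the next frame.
def predict_next_center(first_param, second_param):
    """first_param, second_param: FittedEllipse values of the second area
    at the previous and the current moment."""
    dx = second_param.xc - first_param.xc
    dy = second_param.yc - first_param.yc
    return second_param.xc + dx, second_param.yc + dy
```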
It should be noted that, in the embodiment shown in Fig. 4, when case a occurs, i.e. when a new skin color object appears, the center-point coordinate value of the skin color area corresponding to that skin color object in the next frame cannot be predicted. Only after ellipse fitting has been performed on the skin color area corresponding to the newly appeared skin color object in both the current and the next input frame can the fitted-ellipse parameters of that skin color area in subsequent frames be predicted based on the ellipse-fitting results of the initial two input frames.
Although the present disclosure is as above, the present invention is not limited thereto. Any person skilled in the art can make various changes and modifications without departing from the spirit and scope of the present invention; therefore, the protection scope of the present invention shall be defined by the claims.

Claims (13)

  1. A tracking method based on skin color detection, characterized by including:
    performing ellipse fitting on at least one first area to obtain the fitted-ellipse parameters of each first area, the first area being a skin color area in a first input image;
    obtaining a first parameter and a second parameter of a second area based on the relation between a distance threshold μ and the distances between the pixels of the second area and each first area, the second area being a skin color area in a second input image, the first parameter being the fitted-ellipse parameters of the corresponding first area in an ellipse parameter set, and the second parameter being the fitted-ellipse parameters obtained by performing ellipse fitting on the second area;
    tracking the second area based on the first parameter and the second parameter of the second area;
    wherein:
    the fitted-ellipse parameters include the coordinate value of the center point of the fitted ellipse;
    the ellipse parameter set includes the fitted-ellipse parameters of each first area;
    the distance between a pixel and a first area is the distance between the pixel and the center point of the fitted ellipse of that first area in the ellipse parameter set;
    obtaining the first parameter and the second parameter of the second area based on the relation between the distance threshold μ and the distances between the pixels of the second area and each first area includes:
    if the distances between all pixels of the second area and at least one first area are all less than μ, the first parameter of the second area is the fitted-ellipse parameters of the first area, among the at least one first area, that is closest to the second area, and the second parameter of the second area is the fitted-ellipse parameters obtained by performing ellipse fitting on all pixels of the second area;
    if the distances between part of the pixels of the second area and N first areas h1, h2, ..., hN are all less than μ, determining that the second area has N first parameters A1, A2, ..., AN and N second parameters B1, B2, ..., BN, where Aj is the fitted-ellipse parameters of the first area hj, Bj is the fitted-ellipse parameters obtained by performing ellipse fitting on the pixels of a first set and a second set, the first set is the set of said part of the pixels, the second set is the set of pixels of the second area, other than said part of the pixels, corresponding to the first area hj, the distance between a pixel corresponding to the first area hj and the first area hj is less than the distance between that pixel and the other first areas, 1 ≤ j ≤ N, and N ≥ 2;
    if the distances between all pixels of the second area and any first area are all greater than μ, the first parameter of the second area is empty, and the second parameter of the second area is the fitted-ellipse parameters obtained by performing ellipse fitting on all pixels of the second area.
  2. The tracking method based on Face Detection according to claim 1, characterized in that the skin color area is obtained by a skin color detection method based on a skin color ellipse model.
  3. The tracking method based on Face Detection according to claim 2, characterized in that it further comprises:
    updating the skin color ellipse model according to the formula P(s/c) = γ × P(s/c) + (1 − γ) × P_w(s/c), where s is the pixel value of a pixel of the input image, c is the pixel value of a skin color pixel, P(s/c) is the probability that the pixel is a skin color point, P_w(s/c) is the probability that the pixel is a skin color point obtained through the skin color ellipse model over w consecutive frames of images, and γ is a sensitivity parameter.
  4. The tracking method based on Face Detection according to claim 1, characterized in that the ellipse fitted to an area is determined by computing the covariance matrix of the pixels of that area.
  5. The tracking method based on Face Detection according to claim 1, characterized in that
    the first area closest to the second area is the first area having the largest number of corresponding pixels, a pixel corresponding to a first area being a pixel of the second area whose distance to that first area is smaller than its distances to the other first areas.
  6. The tracking method based on Face Detection according to claim 1, characterized in that the value of the distance threshold μ ranges from 1 to 2.
  7. The tracking method based on Face Detection according to claim 1, characterized in that the fitted ellipse parameter further comprises the major-axis length, the minor-axis length and the rotation angle of the fitted ellipse;
    the distance between a pixel of the second area and a first area is calculated based on the formula d(p, h) = sqrt((x′/α)² + (y′/β)²), with x′ = (x − x_c)cosθ + (y − y_c)sinθ and y′ = −(x − x_c)sinθ + (y − y_c)cosθ, where p is a pixel of the second area, (x, y) is the coordinate of the point p, h is the fitted ellipse corresponding to the first area, (x_c, y_c) is the coordinate of the center point of the fitted ellipse, specifically the center point of the fitted ellipse corresponding to the first area, α is the major-axis length of the fitted ellipse, β is the minor-axis length of the fitted ellipse, and θ is the rotation angle of the fitted ellipse.
  8. The tracking method based on Face Detection according to claim 1, characterized in that it further comprises: after tracking all second areas over K consecutive frames, if the distances between all pixels of all the second areas of the K consecutive frames and a same first area are all greater than μ, deleting the fitted ellipse parameter of that first area from the ellipse parameter set, where the value of K ranges from 5 to 20.
  9. The tracking method based on Face Detection according to claim 1, characterized in that it further comprises: updating, in the ellipse parameter set, the fitted ellipse parameter of the first area corresponding to the second area to the second parameter of the second area.
  10. The tracking method based on Face Detection according to claim 1, characterized in that it further comprises:
    determining the coordinates (x_{c+1}, y_{c+1}) of the center point of the fitted ellipse corresponding to a third area based on the formula (x_{c+1}, y_{c+1}) = (x_c, y_c) + Δc, the third area being the skin color area in the next input image frame that corresponds to the second area;
    where Δc = (x_c, y_c) − (x_{c−1}, y_{c−1}), (x_c, y_c) is the coordinate of the center point of the fitted ellipse, specifically the center point of the fitted ellipse in the second parameter of the second area, and (x_{c−1}, y_{c−1}) is the coordinate of the center point of the fitted ellipse in the first parameter of the second area.
  11. A tracking device based on Face Detection, characterized in that it comprises:
    a first acquisition unit, adapted to perform ellipse fitting on at least one first area respectively to obtain a fitted ellipse parameter of each first area, the first area being a skin color area in a first input image;
    a second acquisition unit, adapted to obtain a first parameter and a second parameter of a second area based on the relation between a distance threshold μ and the distances between the pixels of the second area and each first area, the second area being a skin color area in a second input image, the first parameter being the fitted ellipse parameter of the corresponding first area in an ellipse parameter set, and the second parameter being the fitted ellipse parameter obtained by ellipse fitting performed on the second area; and
    a tracking unit, adapted to track the second area based on the first parameter and the second parameter of the second area;
    wherein:
    the fitted ellipse parameter comprises the coordinates of the center point of the fitted ellipse;
    the ellipse parameter set comprises the fitted ellipse parameters of all the first areas;
    the distance between a pixel and a first area is the distance between the pixel and the center point of the fitted ellipse of that first area in the ellipse parameter set;
    obtaining the first parameter and the second parameter of the second area based on the relation between the distance threshold μ and the distances between the pixels of the second area and each first area comprises:
    if the distances between all pixels of the second area and at least one first area are all smaller than μ, the first parameter of the second area is the fitted ellipse parameter of the first area, among the at least one first area, that is closest to the second area, and the second parameter of the second area is the fitted ellipse parameter obtained by performing ellipse fitting on all pixels of the second area;
    if the distances between part of the pixels of the second area and N first areas h1, h2, ..., hN are all smaller than μ, determining that the second area has N first parameters A1, A2, ..., AN and N second parameters B1, B2, ..., BN, where Aj is the fitted ellipse parameter of the first area hj, Bj is the fitted ellipse parameter obtained by performing ellipse fitting on the pixels of a first set and a second set, the first set is the set of said part of the pixels, the second set is the set of pixels of the second area, other than said part of the pixels, that correspond to the first area hj, a pixel corresponding to the first area hj being a pixel whose distance to the first area hj is smaller than its distances to the other first areas, 1 ≤ j ≤ N, and N ≥ 2;
    if the distances between all pixels of the second area and every first area are all greater than μ, the first parameter of the second area is empty, and the second parameter of the second area is the fitted ellipse parameter obtained by performing ellipse fitting on all pixels of the second area.
  12. The tracking device based on Face Detection according to claim 11, characterized in that it further comprises: an updating unit, adapted to update, in the ellipse parameter set, the fitted ellipse parameter of the first area corresponding to the second area to the second parameter of the second area.
  13. The tracking device based on Face Detection according to claim 11, characterized in that it further comprises: a prediction unit, adapted to determine the coordinates (x_{c+1}, y_{c+1}) of the center point of the fitted ellipse corresponding to a third area based on the formula (x_{c+1}, y_{c+1}) = (x_c, y_c) + Δc, the third area being the skin color area in the next input image frame that corresponds to the second area;
    where Δc = (x_c, y_c) − (x_{c−1}, y_{c−1}), (x_c, y_c) is the coordinate of the center point of the fitted ellipse, specifically the center point of the fitted ellipse in the second parameter of the second area, and (x_{c−1}, y_{c−1}) is the coordinate of the center point of the fitted ellipse in the first parameter of the second area.
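For readability, the following is an illustrative Python sketch, not part of the claims, of the pixel-to-ellipse distance referred to in claims 6 and 7. It assumes the distance is the usual normalized distance to a rotated ellipse, so that values at or below 1 fall inside the fitted ellipse, which is consistent with the threshold μ being taken between 1 and 2; the function name ellipse_distance is hypothetical.

import math

def ellipse_distance(x, y, xc, yc, alpha, beta, theta):
    # Normalized distance from the pixel (x, y) to the fitted ellipse with
    # center (xc, yc), major-axis length alpha, minor-axis length beta and
    # rotation angle theta; a value <= 1 means the pixel lies inside it.
    dx, dy = x - xc, y - yc
    u = dx * math.cos(theta) + dy * math.sin(theta)    # offset along the major axis
    v = -dx * math.sin(theta) + dy * math.cos(theta)   # offset along the minor axis
    return math.sqrt((u / alpha) ** 2 + (v / beta) ** 2)

# A pixel at the end of the major axis gives distance 1.0.
print(ellipse_distance(110.0, 50.0, xc=100.0, yc=50.0, alpha=10.0, beta=5.0, theta=0.0))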
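Similarly, a minimal sketch of the skin color ellipse model update of claim 3, under the assumption that the model is stored as a lookup table mapping a pixel value s to the probability P(s/c) that it is a skin color point; the names update_skin_model, prob and prob_window are hypothetical.

def update_skin_model(prob, prob_window, gamma):
    # Blend the running probability P(s/c) with the probability P_w(s/c)
    # observed through the ellipse model over the last w frames:
    # P(s/c) = gamma * P(s/c) + (1 - gamma) * P_w(s/c).
    return {s: gamma * p + (1.0 - gamma) * prob_window.get(s, 0.0)
            for s, p in prob.items()}

# Example with a toy two-entry table and sensitivity gamma = 0.8.
print(update_skin_model({10: 0.9, 20: 0.2}, {10: 0.5, 20: 0.6}, gamma=0.8))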
CN201310633324.4A 2013-11-29 2013-11-29 A kind of tracking and device based on Face Detection Active CN104680551B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201310633324.4A CN104680551B (en) 2013-11-29 2013-11-29 A kind of tracking and device based on Face Detection


Publications (2)

Publication Number Publication Date
CN104680551A CN104680551A (en) 2015-06-03
CN104680551B true CN104680551B (en) 2017-11-21

Family

ID=53315544

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201310633324.4A Active CN104680551B (en) 2013-11-29 2013-11-29 A kind of tracking and device based on Face Detection

Country Status (1)

Country Link
CN (1) CN104680551B (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108021881B (en) * 2017-12-01 2023-09-01 腾讯数码(天津)有限公司 Skin color segmentation method, device and storage medium
CN117095067B (en) * 2023-10-17 2024-02-02 山东虹纬纺织有限公司 Textile color difference detection method based on artificial intelligence


Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
GB2476397B (en) * 2008-10-15 2014-04-30 Spinella Ip Holdings Inc Digital processing method and system for determination of optical flow

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1717695A (en) * 2002-11-29 2006-01-04 索尼英国有限公司 Face detection and tracking
CN101288103A (en) * 2005-08-18 2008-10-15 高通股份有限公司 Systems, methods, and apparatus for image processing, for color classification, and for skin color detection
CN101620673A (en) * 2009-06-18 2010-01-06 北京航空航天大学 Robust face detecting and tracking method
CN101699510A (en) * 2009-09-02 2010-04-28 北京科技大学 Particle filtering-based pupil tracking method in sight tracking system

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
A new skin color detection method based on direct least squares ellipse fitting; Gao Jianpo et al.; Signal Processing (《信号处理》); 2008-04-30; Vol. 24, No. 2; Sections 2 and 3 *

Also Published As

Publication number Publication date
CN104680551A (en) 2015-06-03

Similar Documents

Publication Publication Date Title
CN107767405B (en) Nuclear correlation filtering target tracking method fusing convolutional neural network
CN107169994B (en) Correlation filtering tracking method based on multi-feature fusion
CN102509074B (en) Target identification method and device
CN104835175B (en) Object detection method in a kind of nuclear environment of view-based access control model attention mechanism
CN106096577A (en) Target tracking system in a kind of photographic head distribution map and method for tracing
CN101281648A (en) Method for tracking dimension self-adaption video target with low complex degree
KR20130121202A (en) Method and apparatus for tracking object in image data, and storage medium storing the same
CN106127145A (en) Pupil diameter and tracking
CN105913028A (en) Face tracking method and face tracking device based on face++ platform
CN109767454A (en) Based on Space Time-frequency conspicuousness unmanned plane video moving object detection method
CN110349186B (en) Large-displacement motion optical flow calculation method based on depth matching
CN106778767B (en) Visual image feature extraction and matching method based on ORB and active vision
CN108364305A (en) Vehicle-mounted pick-up video target tracking method based on modified DSST
CN103632126A (en) Human face tracking method and device
CN109871829A (en) A kind of detection model training method and device based on deep learning
CN108805902A (en) A kind of space-time contextual target tracking of adaptive scale
CN104680122B (en) A kind of tracking and device based on Face Detection
CN106529441A (en) Fuzzy boundary fragmentation-based depth motion map human body action recognition method
CN104680551B (en) A kind of tracking and device based on Face Detection
CN105405152B (en) Adaptive scale method for tracking target based on structuring support vector machines
CN102509308A (en) Motion segmentation method based on mixtures-of-dynamic-textures-based spatiotemporal saliency detection
CN107590820A (en) A kind of object video method for tracing and its intelligent apparatus based on correlation filtering
CN116129386A (en) Method, system and computer readable medium for detecting a travelable region
CN103236053B (en) A kind of MOF method of moving object detection under mobile platform
CN110689559A (en) Visual target tracking method based on dense convolutional network characteristics

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant